Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions (Q2393890)
scientific article; zbMATH DE number 3202903
1959
Let \((R, S, m)\) be a \(\sigma\)-finite measure space. \(V\) denotes the class of all absolutely continuous probability distributions (\(X, X', X_n,\dots,\) say) with densities \(dX/dm = p(x)\), \(dX'/dm = p'(x)\), \(dX_n/dm = p_n(x),\dots\). The uniform metric on \(V\) is defined by \(d(X,X') = \text{ess.}\sup | p(x) - p'(x)|\), and the entropy of each \(X\in V\) by \(H(X) = \int p(x) \log p(x) \,dm\). The author proves: (1) When \(m(R)< \infty\), \(d(X_n,X)\to 0\) implies \(H(X_n) \to H(X)\) \((n\to\infty)\). (2) When \(m(R) = \infty\), if \(X\in V\) and \(m\) are mutually absolutely continuous, and if \(d(X_n,X)\to 0\) and \(\text{ess.} \sup_{p(x)>0} | 1 - p_n(x)/p(x)| \to 0\) \((n\to\infty)\), then \(H(X_n)\to H(X)\). A characterization theorem for the entropy \(H(X)\) of a continuously valued finite-dimensional random variable \(X\) was proved earlier by \textit{H. Hatori} [Kodai Math. Semin. Rep. 10, 172--176 (1958; Zbl 0087.33201)]. The present author gives a similar characterization of \(H(X)\) in the abstract \(\sigma\)-finite measure space \((R, S, m)\), based on the continuity properties (1) and (2) together with two additional properties corresponding to the entropy of the uniform distribution and the conditional entropy.
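A minimal numerical sketch of result (1), not taken from the paper: take \(R = [0,1]\) with \(m\) Lebesgue measure, so \(m(R) = 1 < \infty\), the limit density \(p(x) \equiv 1\), and the hypothetical approximants \(p_n(x) = 1 + \sin(2\pi x)/n\), so \(d(X_n, X) = 1/n \to 0\). The entropy below follows the review's sign convention \(H(X) = \int p \log p \,dm\), giving \(H(X) = 0\).

```python
import math

def entropy(p, grid_size=100_000):
    """Midpoint Riemann-sum approximation of H = ∫_0^1 p(x) log p(x) dx
    (sign convention as in the review)."""
    h = 1.0 / grid_size
    total = 0.0
    for i in range(grid_size):
        x = (i + 0.5) * h
        px = p(x)
        total += px * math.log(px) * h
    return total

# Limit density p(x) = 1, so H(X) = 0.
print(entropy(lambda x: 1.0))

# Approximants p_n(x) = 1 + sin(2*pi*x)/n satisfy d(X_n, X) = 1/n -> 0,
# and H(X_n) shrinks toward H(X) = 0, illustrating theorem (1).
for n in (2, 10, 50):
    pn = lambda x, n=n: 1.0 + math.sin(2 * math.pi * x) / n
    print(n, entropy(pn))
```

The computed values of \(H(X_n)\) decrease toward \(0\) roughly like \(1/(4n^2)\), consistent with the continuity statement.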
Shannon-Wiener information measure
continuous probability distributions