Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions (Q2393890)
From MaRDI portal
Property / full work available at URL: https://doi.org/10.1007/bf01737401
Property / OpenAlex ID: W2037790254
Language | Label | Description | Also known as
---|---|---|---
English | Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions | scientific article |
Statements
Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions (English)
Publication year: 1959
Let \((R, S, m)\) be a \(\sigma\)-finite measure space, and let \(V\) denote the class of all absolutely continuous probability distributions \((X, X', X_n,\dots,\) say) with densities \(dX/dm = p(x)\), \(dX'/dm = p'(x)\), \(dX_n/dm = p_n(x),\dots\). The uniform metric on \(V\) is defined by \(d(X,X') = \text{ess.}\sup | p(x) - p'(x)|\), and the entropy of each \(X\in V\) by \(H(X) = \int p(x) \log p(x) \,dm\). The author proves: (1) When \(m(R)< \infty\), \(d(X_n,X)\to 0\) implies \(H(X_n) \to H(X)\) as \(n\to\infty\). (2) When \(m(R) = \infty\), if \(X\in V\) and \(m\) are mutually absolutely continuous, \(d(X_n,X)\to 0\), and \(\text{ess.} \sup_{p(x)>0} | 1 - p_n(x)/p(x)| \to 0\) as \(n\to\infty\), then \(H(X_n)\to H(X)\). A characterization theorem for the entropy \(H(X)\) of a continuously valued finite-dimensional random variable \(X\) was proved earlier by \textit{H. Hatori} [Kodai Math. Semin. Rep. 10, 172--176 (1958; Zbl 0087.33201)]. The present author gives a similar characterization of \(H(X)\) in the abstract \(\sigma\)-finite measure space \((R, S, m)\), based on the continuity results (1) and (2) together with two additional properties, corresponding to the entropy of the uniform distribution and to the conditional entropy.
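To make result (1) concrete, here is a minimal numerical sketch, assuming \(R = [0,1]\) with Lebesgue measure \(m\) (so \(m(R) < \infty\)) and the hypothetical densities \(p(x) = 1\) and \(p_n(x) = 1 + \sin(2\pi x)/n\); these choices are illustrative, not taken from the paper. Then \(d(X_n,X) = 1/n \to 0\), and the computed entropies \(H(X_n) \approx 1/(4n^2)\) tend to \(H(X) = 0\), as the theorem predicts.

```python
import numpy as np

# Illustrative setup (not from the paper): R = [0, 1] with Lebesgue
# measure m, limit density p(x) = 1 (so H(X) = 0), and approximants
# p_n(x) = 1 + sin(2*pi*x)/n, which integrate to 1 and satisfy
# d(X_n, X) = ess.sup |p_n(x) - p(x)| = 1/n -> 0.

def entropy(p, dx):
    """H(X) = integral of p log p dm, with the sign convention of the review."""
    return float(np.sum(p * np.log(p)) * dx)

N = 200_000
x = np.linspace(0.0, 1.0, N, endpoint=False) + 0.5 / N  # midpoint grid
dx = 1.0 / N

for n in (2, 10, 100, 1000):
    p_n = 1.0 + np.sin(2 * np.pi * x) / n  # strictly positive for n >= 2
    print(f"n={n:4d}  d(X_n,X)={1/n:.4f}  H(X_n)={entropy(p_n, dx):.8f}")
# H(X_n) ~ 1/(4 n^2) -> 0 = H(X), matching theorem (1).
```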
Shannon-Wiener information measure
continuous probability distributions