Some properties of Rényi entropy over countably infinite alphabets (Q383169)

From MaRDI portal

Latest revision as of 02:59, 7 July 2024

scientific article

    Statements

    Some properties of Rényi entropy over countably infinite alphabets (English)
    25 November 2013
    Let \(\alpha \geq 0\), \(\alpha \neq 1\), be fixed. The Rényi entropy of order \(\alpha\) of the probability distribution \(\mathcal{P}=(p_{1}, \ldots, p_{N})\) is defined as \[ H_{\alpha}(\mathcal{P})=\frac{1}{1-\alpha}\log\sum_{n=1}^{N}p_{n}^{\alpha}, \] and for \(\alpha = 1\) it is defined by continuity, \[ H_{1}(\mathcal{P})= \lim_{\alpha \to 1}H_{\alpha}(\mathcal{P})=-\sum_{n=1}^{N}p_{n}\log(p_{n}), \] i.e., as the Shannon entropy. Rényi entropies are also frequently used when the alphabet is countably infinite (e.g., in statistical physics). In that situation the Rényi entropy of order \(\alpha\) of the probability distribution \(\mathcal{P}=(p_{1}, p_{2}, \ldots)\) is defined as \[ H_{\alpha}(\mathcal{P}) = \frac{1}{1-\alpha}\log\sum_{n=1}^{\infty}p_{n}^{\alpha} \] and \[ H_{1}(\mathcal{P}) = -\sum_{n=1}^{\infty}p_{n}\log(p_{n}). \] If the distribution is finitely supported, the Rényi entropies always exist. For distributions with countably infinite support, however, the defining series may diverge. The authors therefore introduce the (Rényi) critical exponent \[ \alpha_{c}(\mathcal{P})= \inf\left\{\alpha \geq 0 \mid H_{\alpha}(\mathcal{P})< \infty\right\} \] and the (Rényi) region of convergence \[ \mathcal{R}(\mathcal{P})= \left\{\alpha \geq 0\mid H_{\alpha}(\mathcal{P})<\infty\right\}. \] The main result of the second section is that for any probability distribution \(\mathcal{P}\) the function \[ \alpha \longmapsto H_{\alpha}(\mathcal{P}) \] is continuous on \(\mathcal{R}(\mathcal{P})\). Furthermore, if \(H_{\alpha_{c}}(\mathcal{P})=\infty\), then \(\lim_{\alpha\to \alpha_{c}}H_{\alpha}(\mathcal{P})=\infty\). The third section deals with continuity in the argument \(\mathcal{P}\). Here the authors prove, among other results, that for \(\alpha >1\) the mapping \[ \mathcal{P}\longmapsto H_{\alpha}(\mathcal{P}) \] is continuous (with respect to the total variation distance), while for \(\alpha \leq 1\) it is discontinuous. The main result of the paper is stated at the end of the fourth section, in Theorem 8, which reads as follows.
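To make the critical exponent concrete, here is a short numerical sketch (my own illustration, not from the paper; the distribution, truncation level, and function names are chosen for this example): for a zeta-type distribution \(p_{n}\propto n^{-\beta}\), the series \(\sum_{n}p_{n}^{\alpha}\) converges exactly when \(\alpha\beta>1\), so \(\alpha_{c}=1/\beta\).

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum p_n^alpha) / (1 - alpha), natural log.
    At alpha = 1 it falls back to the Shannon entropy -sum p_n log p_n."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

# Zeta-type tail p_n ∝ n^{-beta} (here beta = 2, truncated at N terms):
# sum p_n^alpha converges iff alpha * beta > 1, so alpha_c = 1/beta = 1/2.
beta, N = 2.0, 200_000
Z = sum(n ** -beta for n in range(1, N + 1))
p = [n ** -beta / Z for n in range(1, N + 1)]

# H_alpha is finite and non-increasing for alpha > alpha_c, and grows
# without bound as alpha decreases toward alpha_c = 1/2.
for a in (0.6, 0.8, 1.0, 2.0):
    print(f"H_{a} ≈ {renyi_entropy(p, a):.4f}")
```

Note that the truncation only approximates the infinite-alphabet entropy; the closer \(\alpha\) is to \(\alpha_{c}\), the more terms are needed.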
Let \(\mathcal{P}=(p_{1}, p_{2}, \ldots)\) be a probability distribution and \(r\in [0, + \infty]\). Then there exists a sequence of distributions \((\mathcal{P}_{n})_{n\in \mathbb{N}}\) converging to \(\mathcal{P}\) in the topology induced by the total variation distance such that \[ \lim_{n\to \infty}\lim_{\alpha \to 1+}H_{\alpha}(\mathcal{P}_{n})=H_{1}(\mathcal{P})+r \] and, at the same time, \[ \lim_{\alpha \to 1+}\lim_{n\to \infty}H_{\alpha}(\mathcal{P}_{n})=H_{1}(\mathcal{P}). \] In other words, the two iterated limits can differ by any prescribed amount \(r\). Finally, the fifth section studies the limiting case \(\alpha \to +\infty\).
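As a small illustration of the inner limit \(\alpha \to 1+\) for a fixed countably supported distribution (my own example, not the authors' construction from Theorem 8): for the geometric law \(p_{n}=(1-q)q^{n-1}\) the Rényi entropy has a closed form, and it converges to the Shannon entropy as \(\alpha \to 1\).

```python
import math

def renyi_geometric(q, alpha):
    """Closed-form Rényi entropy of the geometric law p_n = (1-q) q^(n-1), n >= 1:
    sum_n p_n^alpha = (1-q)^alpha / (1 - q^alpha) for alpha > 0, hence for alpha != 1
    H_alpha = log((1-q)^alpha / (1 - q^alpha)) / (1 - alpha)."""
    return math.log((1 - q) ** alpha / (1 - q ** alpha)) / (1 - alpha)

def shannon_geometric(q):
    """Shannon entropy H_1 of the same law: -log(1-q) - q log(q)/(1-q)."""
    return -math.log(1 - q) - q * math.log(q) / (1 - q)

q = 0.5
for a in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(f"H_{a} = {renyi_geometric(q, a):.6f}")
print(f"H_1   = {shannon_geometric(q):.6f}")  # the alpha -> 1+ limit
```

Here every distribution has \(\alpha_{c}=0\), so \(H_{\alpha}\) is finite for all \(\alpha>0\) and the convergence as \(\alpha \to 1+\) is plain continuity; Theorem 8 shows that along a sequence \((\mathcal{P}_{n})\) the order of the two limits can nonetheless matter.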
    Rényi entropies
    Shannon entropy
    countably infinite alphabet
    Rényi critical exponent
    Rényi region of convergence