Some properties of Rényi entropy over countably infinite alphabets (Q383169)

From MaRDI portal
Property / author: Mladen Kovačević
Property / author: Ivan Stanojević
Property / author: Vojin Šenk
Property / MaRDI profile type: MaRDI publication profile
Property / arXiv ID: 1106.5130

Revision as of 14:13, 18 April 2024

scientific article
Language: English
Label: Some properties of Rényi entropy over countably infinite alphabets

    Statements

    Some properties of Rényi entropy over countably infinite alphabets (English)
    25 November 2013
    Let \(\alpha \geq 0\), \(\alpha \neq 1\), be fixed. The Rényi entropy of order \(\alpha\) of the probability distribution \(\mathcal{P}=(p_{1}, \ldots, p_{N})\) is defined as \[ H_{\alpha}(\mathcal{P})=\frac{1}{1-\alpha}\log\sum_{n=1}^{N}p_{n}^{\alpha}, \] and for \(\alpha = 1\) one sets \[ H_{1}(\mathcal{P})= \lim_{\alpha \to 1}H_{\alpha}(\mathcal{P})=-\sum_{n=1}^{N}p_{n}\log(p_{n}). \] Rényi entropies are also frequently used when the alphabet is countably infinite (e.g., in statistical physics). In that situation the Rényi entropy of order \(\alpha\) of the probability distribution \(\mathcal{P}=(p_{1}, p_{2}, \ldots)\) is defined as \[ H_{\alpha}(\mathcal{P}) = \frac{1}{1-\alpha}\log\sum_{n=1}^{\infty}p_{n}^{\alpha} \] and \[ H_{1}(\mathcal{P}) = -\sum_{n=1}^{\infty}p_{n}\log(p_{n}). \] If the probability distribution has finite support, the Rényi entropies always exist; for distributions with countably infinite support, however, the defining series may diverge. The authors therefore introduce the (Rényi) critical exponent \[ \alpha_{c}(\mathcal{P})= \inf\left\{\alpha \geq 0 \mid H_{\alpha}(\mathcal{P})< \infty\right\} \] and the (Rényi) region of convergence \[ \mathcal{R}(\mathcal{P})= \left\{\alpha \geq 0\mid H_{\alpha}(\mathcal{P})<\infty\right\}. \] The main result of the second section is that, for any probability distribution \(\mathcal{P}\), the function \[ \alpha \longmapsto H_{\alpha}(\mathcal{P}) \] is continuous on \(\mathcal{R}(\mathcal{P})\); moreover, if \(H_{\alpha_{c}}(\mathcal{P})=\infty\), then \(\lim_{\alpha\to \alpha_{c}}H_{\alpha}(\mathcal{P})=\infty\). The third section deals with continuity in the argument \(\mathcal{P}\). Here the authors prove, among other results, that for \(\alpha >1\) the mapping \[ \mathcal{P}\longmapsto H_{\alpha}(\mathcal{P}) \] is continuous with respect to the total variation distance, while for \(\alpha \leq 1\) it is discontinuous. The main result of the paper, Theorem 8 at the end of the fourth section, reads as follows.
Let \(\mathcal{P}=(p_{1}, p_{2}, \ldots)\) be a probability distribution and \(r\in [0, + \infty]\). Then there exists a sequence of distributions \((\mathcal{P}_{n})_{n\in \mathbb{N}}\) converging to \(\mathcal{P}\) in the total variation distance such that \[ \lim_{n\to \infty}\lim_{\alpha \to 1+}H_{\alpha}(\mathcal{P}_{n})=H_{1}(\mathcal{P})+r \] and, at the same time, \[ \lim_{\alpha \to 1+}\lim_{n\to \infty}H_{\alpha}(\mathcal{P}_{n})=H_{1}(\mathcal{P}). \] Finally, in the fifth section, the limiting case \(\alpha \to +\infty\) is studied.
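The critical exponent can be illustrated numerically. The following sketch (not from the paper; the distribution \(p_{n} = (6/\pi^{2})\, n^{-2}\) is a hypothetical example chosen for illustration) truncates the series \(\sum_{n} p_{n}^{\alpha}\): since \(\sum_{n} n^{-2\alpha}\) converges exactly when \(\alpha > 1/2\), this distribution has \(\alpha_{c}(\mathcal{P}) = 1/2\), and the truncated entropies grow without bound below the critical exponent but stabilize above it.

```python
import math

def renyi_entropy(probs, alpha):
    # H_alpha(P) = (1/(1-alpha)) * log(sum_n p_n^alpha), with H_1 the Shannon limit.
    if alpha == 1:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs)) / (1 - alpha)

# Example distribution p_n = C / n^2 with C = 6/pi^2 (so that sum p_n = 1).
# Here sum p_n^alpha converges iff alpha > 1/2, i.e. alpha_c(P) = 1/2.
C = 6 / math.pi ** 2

def truncated(alpha, N):
    # Rényi entropy of the series truncated at N terms (not renormalized).
    return renyi_entropy([C * n ** -2 for n in range(1, N + 1)], alpha)

for alpha in (0.25, 0.75):
    print(alpha, truncated(alpha, 10**3), truncated(alpha, 10**5))
# For alpha = 0.25 (below alpha_c) the truncated values keep growing with N,
# reflecting H_alpha(P) = infinity; for alpha = 0.75 they stabilize.
```

Natural logarithms are used throughout; switching to base 2, as is common in information theory, only rescales the values.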
    Rényi entropies
    Shannon entropy
    countably infinite alphabet
    Rényi critical exponent
    Rényi region of convergence