Some properties of Rényi entropy over countably infinite alphabets (Q383169): Difference between revisions
From MaRDI portal
Property / review text | |||
Let \(\alpha \geq 0\), \(\alpha \neq 1\), be fixed. The Rényi entropy of order \(\alpha\) of the probability distribution \(\mathcal{P}=(p_{1}, \ldots, p_{N})\) is defined as \[ H_{\alpha}(\mathcal{P})=\frac{1}{1-\alpha}\log\sum_{n=1}^{N}p_{n}^{\alpha}, \] and in the limiting case \[ H_{1}(\mathcal{P})= \lim_{\alpha \to 1}H_{\alpha}(\mathcal{P})=-\sum_{n=1}^{N}p_{n}\log(p_{n}). \] Rényi entropies are also frequently used when the alphabet is countably infinite (e.g. in statistical physics). In this situation the Rényi entropy of order \(\alpha\) of the probability distribution \(\mathcal{P}=(p_{1}, p_{2}, \ldots)\) is defined as \[ H_{\alpha}(\mathcal{P}) = \frac{1}{1-\alpha}\log\sum_{n=1}^{\infty}p_{n}^{\alpha} \] and \[ H_{1}(\mathcal{P}) = -\sum_{n=1}^{\infty}p_{n}\log(p_{n}). \] Clearly, if the probability distribution is finitely supported, all Rényi entropies exist. For distributions with countably infinite support, however, the defining series may diverge. The authors therefore introduce the (Rényi) critical exponent \[ \alpha_{c}(\mathcal{P})= \inf\left\{\alpha \geq 0 \mid H_{\alpha}(\mathcal{P})< \infty\right\} \] and the (Rényi) region of convergence \[ \mathcal{R}(\mathcal{P})= \left\{\alpha \geq 0\mid H_{\alpha}(\mathcal{P})<\infty\right\}. \] The main result of the second section is that for any probability distribution \(\mathcal{P}\) the function \[ \alpha \longmapsto H_{\alpha}(\mathcal{P}) \] is continuous on \(\mathcal{R}(\mathcal{P})\); furthermore, if \(H_{\alpha_{c}}(\mathcal{P})=\infty\), then \(\lim_{\alpha\to \alpha_{c}}H_{\alpha}(\mathcal{P})=\infty\). The third section deals with continuity in the argument \(\mathcal{P}\). Here the authors prove, among other results, that for \(\alpha >1\) the mapping \[ \mathcal{P}\longmapsto H_{\alpha}(\mathcal{P}) \] is continuous, while for \(\alpha \leq 1\) it is discontinuous. The main result of the paper, Theorem 8 at the end of the fourth section, reads as follows.
Let \(\mathcal{P}=(p_{1}, p_{2}, \ldots)\) be a probability distribution and \(r\in [0, + \infty]\). Then there exists a sequence of distributions \((\mathcal{P}_{n})_{n\in \mathbb{N}}\), converging to \(\mathcal{P}\) in the topology determined by the total variation distance, such that \[ \lim_{n\to \infty}\lim_{\alpha \to 1+}H_{\alpha}(\mathcal{P}_{n})=H_{1}(\mathcal{P})+r \] and at the same time \[ \lim_{\alpha \to 1+}\lim_{n\to \infty}H_{\alpha}(\mathcal{P}_{n})=H_{1}(\mathcal{P}). \] Finally, in the fifth section, the limiting case \(\alpha \to +\infty\) is studied. | |||
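As a small numerical illustration of these definitions (not taken from the paper), consider the distribution \(p_{n} = C/n^{2}\): since \(\sum_{n} p_{n}^{\alpha} < \infty\) exactly when \(2\alpha > 1\), its critical exponent is \(\alpha_{c} = 1/2\), with \(H_{1/2}(\mathcal{P}) = \infty\) and \(\mathcal{R}(\mathcal{P}) = (1/2, \infty)\). A minimal Python sketch, truncating the infinite sums at a large \(N\) (all names below are illustrative, not from the paper):

```python
import math

N = 100_000  # truncation level for the infinite sums

# Zeta-type distribution p_n = C / n^2, n = 1, 2, ...; C normalizes the sum to 1.
C = 1.0 / sum(1.0 / n**2 for n in range(1, N + 1))
p = [C / n**2 for n in range(1, N + 1)]

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(P) = log(sum p_n^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(q**alpha for q in p)) / (1.0 - alpha)

def shannon_entropy(p):
    """Limiting case alpha -> 1: H_1(P) = -sum p_n log p_n."""
    return -sum(q * math.log(q) for q in p if q > 0)

# Inside the region of convergence (alpha > 1/2) the truncated sums stabilize;
# for alpha <= 1/2 they grow without bound as N increases, reflecting alpha_c = 1/2.
for alpha in (0.6, 0.75, 2.0, 3.0):
    print(f"H_{alpha}(P) ~ {renyi_entropy(p, alpha):.4f}")
print(f"H_1(P)   ~ {shannon_entropy(p):.4f}")
```

The printed values also exhibit the standard monotonicity of \(\alpha \mapsto H_{\alpha}(\mathcal{P})\): for a fixed distribution, the entropy is nonincreasing in \(\alpha\).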
Normal rank | |||
Property / reviewed by | |||
Property / reviewed by: Eszter Gselmann / rank | |||
Normal rank | |||
Property / Mathematics Subject Classification ID | |||
Property / Mathematics Subject Classification ID: 94A17 / rank | |||
Normal rank | |||
Property / Mathematics Subject Classification ID | |||
Property / Mathematics Subject Classification ID: 94A15 / rank | |||
Normal rank | |||
Property / zbMATH DE Number | |||
Property / zbMATH DE Number: 6232254 / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
Rényi entropies | |||
Property / zbMATH Keywords: Rényi entropies / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
Shannon entropy | |||
Property / zbMATH Keywords: Shannon entropy / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
countably infinite alphabet | |||
Property / zbMATH Keywords: countably infinite alphabet / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
Rényi critical exponent, Rényi region of convergence | |||
Property / zbMATH Keywords: Rényi critical exponent, Rényi region of convergence / rank | |||
Normal rank |
Revision as of 12:13, 29 June 2023
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | Some properties of Rényi entropy over countably infinite alphabets | scientific article |
Statements
Some properties of Rényi entropy over countably infinite alphabets (English)
25 November 2013
Rényi entropies
Shannon entropy
countably infinite alphabet
Rényi critical exponent, Rényi region of convergence