Entropy-variance inequalities for discrete log-concave random variables via degree of freedom (Q6080139)
scientific article; zbMATH DE number 7756947
Language | Label | Description | Also known as
---|---|---|---
English | Entropy-variance inequalities for discrete log-concave random variables via degree of freedom | scientific article; zbMATH DE number 7756947 |
Statements
Entropy-variance inequalities for discrete log-concave random variables via degree of freedom (English)
30 October 2023
Let \(X\) be an integer-valued random variable with contiguous support, and suppose that \(X\) has a log-concave mass function, that is, \(\mathbb{P}(X=x)^2\geq\mathbb{P}(X=x-1)\mathbb{P}(X=x+1)\) for all \(x\). For \(\alpha>0\) with \(\alpha\neq1\), let \[ H_{\alpha}(X)=\frac{1}{1-\alpha}\log\sum_{x}\mathbb{P}(X=x)^\alpha \] be the Rényi entropy of order \(\alpha\), with the limit as \(\alpha\to\infty\) being the min-entropy \(H_\infty(X)=-\log\max_x\mathbb{P}(X=x)\) and the limit as \(\alpha\to1\) the usual Shannon entropy. The author shows that the entropy power \(N_\alpha(X)=e^{2H_\alpha(X)}\) satisfies \[ N_\infty(X)\geq1+\mathrm{Var}(X)\,, \] with equality achieved asymptotically by geometric distributions whose parameter tends to either 0 or 1. This result is used to establish the following entropy power inequality: if \(S_n=X_1+\cdots+X_n\) is a sum of independent discrete log-concave random variables as above, then \[ \Delta_\alpha(S_n)\geq\frac{\alpha-1}{4(3\alpha-1)}\sum_{i=1}^n\Delta_\alpha(X_i) \] for \(\alpha>1\), where \(\Delta_{\alpha}(X)=N_\alpha(X)-1\). The proof of the main result makes use of a notion of degree of freedom for log-concave sequences.
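As a quick numerical illustration (not from the paper), the following Python sketch computes the min-entropy power \(N_\infty(X)=1/(\max_x\mathbb{P}(X=x))^2\) and the variance of a geometric distribution, checking \(N_\infty(X)\geq1+\mathrm{Var}(X)\) and showing the ratio approach 1 as the parameter tends to 0 or 1; it also checks the Rényi entropy power bound for a sum of two log-concave variables with \(\alpha=2\), using the constant \(\frac{\alpha-1}{4(3\alpha-1)}\) quoted above. All function names, parameter values, and probability mass functions here are illustrative choices, not taken from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (natural log); alpha = np.inf gives the min-entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isinf(alpha):
        return -np.log(p.max())
    if alpha == 1.0:
        return -np.sum(p * np.log(p))      # Shannon entropy (the alpha -> 1 limit)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def entropy_power(p, alpha):
    """Entropy power N_alpha(X) = exp(2 H_alpha(X))."""
    return np.exp(2.0 * renyi_entropy(p, alpha))

def variance(p):
    """Variance of a pmf supported on 0, 1, ..., len(p) - 1."""
    k = np.arange(len(p))
    m = np.sum(k * p)
    return np.sum(p * (k - m) ** 2)

def geometric_pmf(q, n_max=4000):
    """Truncated geometric pmf P(X = k) = (1 - q) q^k, which is log-concave for q in (0, 1)."""
    p = (1 - q) * q ** np.arange(n_max + 1)
    return p / p.sum()

# Check N_inf(X) >= 1 + Var(X); the ratio tends to 1 as q -> 0 or q -> 1.
for q in (0.01, 0.5, 0.99):
    p = geometric_pmf(q)
    n_inf, bound = entropy_power(p, np.inf), 1.0 + variance(p)
    print(f"q = {q}: N_inf = {n_inf:.4f}, 1 + Var = {bound:.4f}, ratio = {n_inf / bound:.4f}")

# Check the entropy power bound for a sum of two log-concave summands with alpha = 2,
# using the constant (alpha - 1) / (4 (3 alpha - 1)) stated in the review above.
alpha = 2.0
x1 = geometric_pmf(0.3, n_max=200)
x2 = np.array([0.1, 0.3, 0.3, 0.2, 0.1])   # an illustrative log-concave pmf on {0, ..., 4}
s = np.convolve(x1, x2)                    # pmf of the independent sum X1 + X2

def delta(p):
    """Delta_alpha(X) = N_alpha(X) - 1."""
    return entropy_power(p, alpha) - 1.0

lhs = delta(s)
rhs = (alpha - 1) / (4 * (3 * alpha - 1)) * (delta(x1) + delta(x2))
print(f"Delta_2(S) = {lhs:.4f} >= {rhs:.4f}")
```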
log-concave distributions
degree of freedom
extreme points
Rényi entropy
entropy power inequalities