Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality
From MaRDI portal
Publication:5281202
DOI: 10.1109/TIT.2010.2070570
zbMATH Open: 1366.94194
DBLP: journals/tit/JohnsonY10
arXiv: 0909.0641
Wikidata: Q60522089
Scholia: Q60522089
MaRDI QID: Q5281202
FDO: Q5281202
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Abstract: We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the Entropy Power Inequality do not in fact hold, but propose an alternative formulation which does always hold. The key to many proofs of Shannon's Entropy Power Inequality is the behaviour of entropy on scaling of continuous random variables. We believe that Rényi's operation of thinning discrete random variables plays a similar role to scaling, and give a sharp bound on how the entropy of ultra log-concave random variables behaves on thinning. In the spirit of the monotonicity results established by Artstein, Ball, Barthe and Naor, we prove a stronger version of concavity of entropy, which implies a strengthened form of our discrete Entropy Power Inequality.
Full work available at URL: https://arxiv.org/abs/0909.0641
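The Rényi thinning operation mentioned in the abstract replaces each of the X "points" of a discrete variable by an independent Bernoulli(α) survival trial, so T_α X = B_1 + … + B_X. A small numerical sketch (function names and the truncation cutoff are our own, not from the paper) can check two facts implicit in the abstract: thinning a Poisson(λ) variable yields Poisson(αλ), and thinning reduces entropy.

```python
import math

def poisson_pmf(lam, n_max):
    # Truncated Poisson(lam) pmf on {0, ..., n_max}; the tail beyond
    # n_max is negligible for the parameters used below.
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n_max + 1)]

def thin(pmf, alpha):
    # Renyi thinning T_alpha: given X = n, the thinned value is
    # Binomial(n, alpha), so we mix binomial pmfs over the law of X.
    n_max = len(pmf) - 1
    out = [0.0] * (n_max + 1)
    for k in range(n_max + 1):
        for n in range(k, n_max + 1):
            out[k] += pmf[n] * math.comb(n, k) * alpha**k * (1 - alpha)**(n - k)
    return out

def entropy(pmf):
    # Shannon entropy in nats.
    return -sum(p * math.log(p) for p in pmf if p > 0)

lam, alpha, n_max = 2.0, 0.4, 60
thinned = thin(poisson_pmf(lam, n_max), alpha)

# Thinning Poisson(lam) gives Poisson(alpha * lam).
target = poisson_pmf(alpha * lam, n_max)
max_err = max(abs(a - b) for a, b in zip(thinned, target))

# Thinning shrinks the parameter, hence the entropy.
h_before = entropy(poisson_pmf(lam, n_max))
h_after = entropy(thinned)
```

This only illustrates the semigroup property of thinning on the Poisson family; the paper's sharp bound concerns how entropy behaves under thinning for the wider class of ultra log-concave variables.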
Recommendations
- A generalization of the entropy power inequality with applications
- Entropy and thinning of discrete random variables
- Variants of the Entropy Power Inequality
- Partial monotonicity of entropy measures
- On some inequalities for entropies of discrete probability distributions
- On partial monotonic behaviour of some entropy measures
- On Rényi Entropy Power Inequalities
- A proof of the Shepp-Olkin entropy monotonicity conjecture
- A Generalization of Costa’s Entropy Power Inequality
- A Strong Entropy Power Inequality
Cited In (8)
- Concentration functions and entropy bounds for discrete log-concave distributions
- A discrete complement of Lyapunov's inequality and its information theoretic consequences
- The one-mode quantum-limited Gaussian attenuator and amplifier have Gaussian maximizers
- Entropy power inequalities for qudits
- Gaussian optimizers for entropic inequalities in quantum information
- Majorization and Rényi entropy inequalities via Sperner theory
- Bernoulli sums and Rényi entropy inequalities
- Entropy Inequalities for Sums in Prime Cyclic Groups