Beyond the Entropy Power Inequality, via Rearrangements
Publication: 2986133
DOI: 10.1109/TIT.2014.2338852
zbMATH Open: 1360.62028
arXiv: 1307.6018
OpenAlex: W1968021238
MaRDI QID: Q2986133
FDO: Q2986133
Authors: Liyao Wang, Mokshay Madiman
Publication date: 16 May 2017
Published in: IEEE Transactions on Information Theory
Abstract: A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of rearrangements. For the special case of Boltzmann-Shannon entropy, this lower bound is better than that given by the entropy power inequality. Several applications are discussed, including a new proof of the classical entropy power inequality and an entropy inequality involving symmetrization of Lévy processes.
Full work available at URL: https://arxiv.org/abs/1307.6018
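For orientation, the following LaTeX sketch juxtaposes the classical entropy power inequality with the rearrangement-based lower bound described in the abstract. The notation ($h$ for differential entropy, $N$ for entropy power, $X_i^*$ for an independent random vector whose density is the spherically symmetric decreasing rearrangement of the density of $X_i$) is standard usage assumed here for illustration, not quoted from the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Classical entropy power inequality (Shannon): for independent
% random vectors X, Y in R^n,
\[
  N(X + Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n}.
\]

% Rearrangement-based lower bound described in the abstract (sketch):
% X_i^* denotes an independent random vector whose density is the
% spherically symmetric decreasing rearrangement of the density of X_i;
% the abstract states the bound for Renyi differential entropy, with the
% Boltzmann-Shannon case h shown here.
\[
  h(X_1 + \dots + X_k) \;\ge\; h(X_1^* + \dots + X_k^*).
\]

\end{document}
```

In the Shannon case this rearrangement bound is, per the abstract, at least as strong as the entropy power inequality, which is how the paper recovers a new proof of the classical EPI.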
Recommendations
- On Rényi Entropy Power Inequalities
- Rényi entropy power inequality and a reverse
- A generalization of the entropy power inequality with applications
- A new entropy power inequality
- Yet Another Proof of the Entropy Power Inequality
- Variants of the Entropy Power Inequality
- On the Entropy Power Inequality for the Rényi Entropy of Order [0, 1]
- A Strong Entropy Power Inequality
- Two Remarks on Generalized Entropy Power Inequalities
- Entropy Power Inequality for the Rényi Entropy
Mathematics Subject Classification:
- Statistical aspects of information-theoretic topics (62B10)
- Measures of information, entropy (94A17)
Cited In (18)
- On the Problem of Reversibility of the Entropy Power Inequality
- The convexification effect of Minkowski summation
- A Combinatorial Approach to Small Ball Inequalities for Sums and Differences
- Two Remarks on Generalized Entropy Power Inequalities
- Volume of the polar of random sets and shadow systems
- Weighted \(p\)-Rényi entropy power inequality: information theory to quantum Shannon theory
- Elaboration Models with Symmetric Information Divergence
- Quantum Rényi entropy functionals for bosonic Gaussian systems
- Majorization and Rényi entropy inequalities via Sperner theory
- A Log-Det Inequality for Random Matrices
- Bernoulli sums and Rényi entropy inequalities
- An inequality for the convolutions on unimodular locally compact groups and the optimal constant of Young's inequality
- Optimal Concentration of Information Content for Log-Concave Densities
- The norm of the Fourier transform on compact or discrete abelian groups
- Entropic exercises around the Kneser–Poulsen conjecture
- Entropy Inequalities for Sums in Prime Cyclic Groups
- Volumes of subset Minkowski sums and the Lyusternik region
- On the volume of the Minkowski sum of zonoids