A reverse entropy power inequality for log-concave random vectors
From MaRDI portal
Publication: 2833667
DOI: 10.4064/sm8418-6-2016
zbMath: 1407.94055
arXiv: 1509.05926
MaRDI QID: Q2833667
Tomasz Tkocz, Keith M. Ball, Piotr Nayar
Publication date: 18 November 2016
Published in: Studia Mathematica
Full work available at URL: https://arxiv.org/abs/1509.05926
Mathematics Subject Classification
- 60E15: Inequalities; stochastic orderings
- 52A40: Inequalities and extremum problems involving convexity in convex geometry
- 94A17: Measures of information, entropy
Related Items
- Rényi entropy power inequality and a reverse
- Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
- Two Remarks on Generalized Entropy Power Inequalities
- The convexification effect of Minkowski summation
- Gaussian mixtures: entropy and geometric inequalities
- Stability of Cramér's Characterization of Normal Laws in Information Distances
Cites Work
- A Mathematical Theory of Communication
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Partitions of mass-distributions and of convex bodies by hyperplanes
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- Entropy production by block variable summation and central limit theorems
- Proof of an entropy conjecture of Wehrl
- Entropy jumps in the presence of a spectral gap
- On the rate of convergence in the entropic central limit theorem
- Fisher information inequalities and the central limit theorem
- On the Problem of Reversibility of the Entropy Power Inequality
- Stability Problems in Cramér-Type Characterization in the Case of I.I.D. Summands
- Entropy Power Inequality for the Rényi Entropy
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- A simple proof of the entropy-power inequality
- A new entropy power inequality
- Logarithmically concave functions and sections of convex sets in $R^{n}$
- Information theoretic inequalities
- On the maximum entropy of the sum of two dependent random variables
- A short proof of the "concavity of entropy power"
- Solution of Shannon’s problem on the monotonicity of entropy
- Entropy jumps for isotropic log-concave random vectors and spectral gap
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- The convolution inequality for entropy powers