Generalized Entropy Power Inequalities and Monotonicity Properties of Information
From MaRDI portal
Publication:3549075
DOI: 10.1109/TIT.2007.899484
zbMATH Open: 1326.94034
arXiv: cs/0605047
OpenAlex: W2137226437
MaRDI QID: Q3549075
Authors: Mokshay Madiman, Andrew R. Barron
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Abstract: New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both for i.i.d. summands and for the more general setting of independent summands with variance-standardized sums.
Full work available at URL: https://arxiv.org/abs/cs/0605047
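As a sketch of the results summarized in the abstract (notation assumed from the arXiv version, not stated on this page: \(h(X)\) is the differential entropy of a real-valued random variable \(X\), and \(N(X) = \frac{1}{2\pi e} e^{2h(X)}\) its entropy power), the generalized inequality and the resulting monotonicity statement take roughly the following form:

```latex
% Generalized entropy power inequality (sketch): for independent X_1, ..., X_n,
% a collection \mathcal{C} of subsets of {1, ..., n}, and r the maximum number
% of sets in \mathcal{C} containing any fixed index,
N\Big(\sum_{i=1}^{n} X_i\Big) \;\ge\; \frac{1}{r} \sum_{s \in \mathcal{C}} N\Big(\sum_{i \in s} X_i\Big),
\qquad N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}.

% Specializing \mathcal{C} to the leave-one-out subsets (each of size n-1, so
% r = n-1) recovers monotonicity of entropy along the CLT for i.i.d. X_i:
h\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right)
\;\ge\; h\!\left(\frac{X_1 + \cdots + X_{n-1}}{\sqrt{n-1}}\right).
```

This is only a reconstruction from the abstract; the paper itself states the precise hypotheses and the analogous Fisher information inequalities.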
Cited In (29)
- Concentration functions and entropy bounds for discrete log-concave distributions
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- The convexification effect of Minkowski summation
- Contribution to the theory of Pitman estimators
- Monotonicity of the logarithmic energy for random matrices
- Two Remarks on Generalized Entropy Power Inequalities
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- An integral representation of the relative entropy
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- The fractional Fisher information and the central limit theorem for stable laws
- Poincaré-type inequalities for stable densities
- Analysis and applications of the residual varentropy of random lifetimes
- Entropy and set cardinality inequalities for partition-determined functions
- Approximate discrete entropy monotonicity for log-concave sums
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Majorization and Rényi entropy inequalities via Sperner theory
- Rényi divergence and the central limit theorem
- Score functions, generalized relative Fisher information and applications
- Entropy inequalities for stable densities and strengthened central limit theorems
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- An extension of entropy power inequality for dependent random variables
- Bounds on the Poincaré constant for convolution measures
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region
- Stam inequality on \(\mathbb Z_n\)
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures