Generalized Entropy Power Inequalities and Monotonicity Properties of Information
From MaRDI portal
Publication:3549075
Abstract: New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands and in the more general setting of independent summands with variance-standardized sums.
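To fix notation, a brief sketch (our notation, not quoted from the paper): writing \(N(X) = \frac{1}{2\pi e}e^{2h(X)}\) for the entropy power of a random variable \(X\) with differential entropy \(h(X)\), the classical entropy power inequality and the leave-one-out inequality of Artstein, Ball, Barthe and Naor, which is the special case of the subset-sum inequalities treated here in which the collection consists of all \((n-1)\)-element subsets, read as follows.

```latex
% Classical entropy power inequality for independent X, Y:
N(X + Y) \ge N(X) + N(Y).

% Leave-one-out inequality for independent X_1, \dots, X_n
% (special case of the arbitrary-subset inequalities of this paper):
N\!\left(\sum_{i=1}^{n} X_i\right)
  \ge \frac{1}{n-1} \sum_{i=1}^{n} N\!\left(\sum_{j \ne i} X_j\right).

% For i.i.d. standardized sums this yields monotonicity of entropy
% along the central limit theorem:
h\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right)
  \text{ is nondecreasing in } n.
```

The paper's inequalities hold for an arbitrary collection of subsets of the summands; the displayed leave-one-out form is only the best-known instance.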
Cited in (29)
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Concentration functions and entropy bounds for discrete log-concave distributions
- The convexification effect of Minkowski summation
- Contribution to the theory of Pitman estimators
- Monotonicity of the logarithmic energy for random matrices
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Two Remarks on Generalized Entropy Power Inequalities
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- An integral representation of the relative entropy
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- The fractional Fisher information and the central limit theorem for stable laws
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- Poincaré-type inequalities for stable densities
- Entropy and set cardinality inequalities for partition-determined functions
- Analysis and applications of the residual varentropy of random lifetimes
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Majorization and Rényi entropy inequalities via Sperner theory
- Approximate discrete entropy monotonicity for log-concave sums
- Rényi divergence and the central limit theorem
- Score functions, generalized relative Fisher information and applications
- Entropy inequalities for stable densities and strengthened central limit theorems
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- An extension of entropy power inequality for dependent random variables
- Bounds on the Poincaré constant for convolution measures
- Stam inequality on \(\mathbb{Z}_n\)
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region