Generalized Entropy Power Inequalities and Monotonicity Properties of Information
From MaRDI portal
Publication:3549075
Abstract: New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands and in the more general setting of independent summands with variance-standardized sums.
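As a sketch of the type of inequality the abstract describes (stated here in one representative form; the paper's precise hypotheses and the Fisher-information versions differ in detail), the classical entropy power inequality of Shannon and Stam reads, for independent random vectors \(X_1, X_2\) in \(\mathbb{R}^d\) with densities,

```latex
N(X_1 + X_2) \;\ge\; N(X_1) + N(X_2),
\qquad
N(X) := \frac{1}{2\pi e}\, e^{2h(X)/d},
```

where \(h\) denotes differential entropy. The subset-sum generalization can be sketched as follows: if each index \(i \in \{1,\dots,n\}\) belongs to at most \(r\) of the subsets \(s\) in a collection \(\mathcal{C}\), then

```latex
N\!\Big(\sum_{i=1}^{n} X_i\Big)
\;\ge\;
\frac{1}{r}\sum_{s \in \mathcal{C}} N\!\Big(\sum_{i \in s} X_i\Big).
```

Taking \(\mathcal{C}\) to be the collection of all leave-one-out subsets (so \(r = n-1\)) and specializing to i.i.d. summands yields, after rescaling, the monotonicity of entropy along the central limit theorem:

```latex
h\!\Big(\tfrac{1}{\sqrt{n}}\sum_{i=1}^{n} X_i\Big)
\;\ge\;
h\!\Big(\tfrac{1}{\sqrt{n-1}}\sum_{i=1}^{n-1} X_i\Big).
```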
Cited in (29 documents):
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- An extension of entropy power inequality for dependent random variables
- Concentration functions and entropy bounds for discrete log-concave distributions
- Analysis and applications of the residual varentropy of random lifetimes
- Entropy and set cardinality inequalities for partition-determined functions
- Stam inequality on \(\mathbb Z_n\)
- Majorization and Rényi entropy inequalities via Sperner theory
- An integral representation of the relative entropy
- Bounds on the Poincaré constant for convolution measures
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Monotonicity of the logarithmic energy for random matrices
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Contribution to the theory of Pitman estimators
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- Score functions, generalized relative Fisher information and applications
- The fractional Fisher information and the central limit theorem for stable laws
- Poincaré-type inequalities for stable densities
- Approximate discrete entropy monotonicity for log-concave sums
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Rényi divergence and the central limit theorem
- The convexification effect of Minkowski summation
- Two Remarks on Generalized Entropy Power Inequalities
- Entropy and the discrete central limit theorem
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Volumes of subset Minkowski sums and the Lyusternik region
- Entropy inequalities for stable densities and strengthened central limit theorems