Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
Publication: 3548022
DOI: 10.1109/TIT.2006.880066
zbMath: 1320.60111
OpenAlex: W2104072072
MaRDI QID: Q3548022
Sergio Verdú, Antonia M. Tulino
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2006.880066
Related Items (12):
- The fractional Fisher information and the central limit theorem for stable laws
- A stroll along the gamma
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region
- Two Remarks on Generalized Entropy Power Inequalities
- Rényi divergence and the central limit theorem
- Entropy inequalities for stable densities and strengthened central limit theorems
- The convexification effect of Minkowski summation
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Existence of Stein kernels under a spectral gap, and discrepancy bounds