Generalized Entropy Power Inequalities and Monotonicity Properties of Information
Publication:3549075
DOI: 10.1109/TIT.2007.899484
zbMath: 1326.94034
arXiv: cs/0605047
OpenAlex: W2137226437
MaRDI QID: Q3549075
Mokshay Madiman, Andrew R. Barron
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/cs/0605047
Related Items (27)
Concentration functions and entropy bounds for discrete log-concave distributions
Analysis and applications of the residual varentropy of random lifetimes
Entropy and set cardinality inequalities for partition-determined functions
The fractional Fisher information and the central limit theorem for stable laws
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
Score functions, generalized relative Fisher information and applications
Bounds on the Poincaré constant for convolution measures
An extension of entropy power inequality for dependent random variables
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
Weighted Brunn-Minkowski theory. I: On weighted surface area measures
An integral representation of the relative entropy
Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
Entropy and the discrete central limit theorem
Volumes of subset Minkowski sums and the Lyusternik region
Poincaré-type inequalities for stable densities
Two Remarks on Generalized Entropy Power Inequalities
Rényi divergence and the central limit theorem
Stam inequality on \(\mathbb Z_n\)
Entropy inequalities for stable densities and strengthened central limit theorems
The convexification effect of Minkowski summation
Contribution to the theory of Pitman estimators
The convergence of the Rényi entropy of the normalized sums of IID random variables
Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
\(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
Majorization and Rényi entropy inequalities via Sperner theory
Existence of Stein kernels under a spectral gap, and discrepancy bounds