Generalized Entropy Power Inequalities and Monotonicity Properties of Information

From MaRDI portal
Publication:3549075

DOI: 10.1109/TIT.2007.899484
zbMath: 1326.94034
arXiv: cs/0605047
OpenAlex: W2137226437
MaRDI QID: Q3549075

Mokshay Madiman, Andrew R. Barron

Publication date: 21 December 2008

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/cs/0605047

Related Items (27)

- Concentration functions and entropy bounds for discrete log-concave distributions
- Analysis and applications of the residual varentropy of random lifetimes
- Entropy and set cardinality inequalities for partition-determined functions
- The fractional Fisher information and the central limit theorem for stable laws
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Score functions, generalized relative Fisher information and applications
- Bounds on the Poincaré constant for convolution measures
- An extension of entropy power inequality for dependent random variables
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- An integral representation of the relative entropy
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region
- Poincaré-type inequalities for stable densities
- Two Remarks on Generalized Entropy Power Inequalities
- Rényi divergence and the central limit theorem
- Stam inequality on \(\mathbb Z_n\)
- Entropy inequalities for stable densities and strengthened central limit theorems
- The convexification effect of Minkowski summation
- Contribution to the theory of Pitman estimators
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Majorization and Rényi entropy inequalities via Sperner theory
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
