Generalized Entropy Power Inequalities and Monotonicity Properties of Information

From MaRDI portal
Publication:3549075

DOI: 10.1109/TIT.2007.899484 · zbMATH Open: 1326.94034 · arXiv: cs/0605047 · OpenAlex: W2137226437 · MaRDI QID: Q3549075


Authors: Mokshay Madiman, Andrew R. Barron


Publication date: 21 December 2008

Published in: IEEE Transactions on Information Theory

Abstract: New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both for i.i.d. summands and in the more general setting of independent summands with variance-standardized sums.


Full work available at URL: https://arxiv.org/abs/cs/0605047
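The entropy power inequality underlying the abstract states that for independent random variables, N(X+Y) ≥ N(X) + N(Y), where the entropy power is N(X) = exp(2h(X)) / (2πe) and h is differential entropy. A minimal sketch of this relationship for the one case that is checkable in closed form — Gaussian summands, where entropy power equals variance and the inequality holds with equality — is below; the function names are illustrative, not from the paper:

```python
import math

def gaussian_entropy(var):
    # Differential entropy (in nats) of a Gaussian with variance `var`:
    # h = 0.5 * log(2 * pi * e * var).
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_power(h):
    # Entropy power N = exp(2h) / (2 * pi * e).
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Independent Gaussians X ~ N(0, 2) and Y ~ N(0, 3); their sum is N(0, 5).
n_x = entropy_power(gaussian_entropy(2.0))
n_y = entropy_power(gaussian_entropy(3.0))
n_sum = entropy_power(gaussian_entropy(5.0))

# For Gaussians the entropy power recovers the variance, so the EPI
# N(X+Y) >= N(X) + N(Y) is tight: 5.0 >= 2.0 + 3.0 with equality.
print(n_x, n_y, n_sum)
```

For non-Gaussian summands the inequality is strict, which is what drives the monotonicity of entropy along the central limit theorem discussed in the abstract.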







Cited In (29)





This page was built for publication: Generalized Entropy Power Inequalities and Monotonicity Properties of Information
