A New Entropy Power Inequality for Integer-Valued Random Variables

From MaRDI portal
Publication:2986249

DOI: 10.1109/TIT.2014.2317181 · zbMATH Open: 1360.94155 · arXiv: 1301.4185 · MaRDI QID: Q2986249


Authors: Saeid Haghighatshoar, Emmanuel Abbe, İ. Emre Telatar


Publication date: 16 May 2017

Published in: IEEE Transactions on Information Theory

Abstract: The entropy power inequality (EPI) provides lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for special families of distributions, with the differential entropy replaced by the discrete entropy, but no universal inequality is known (beyond trivial ones). More recently, the sumset theory for the entropy function provides a sharp inequality H(X+X') − H(X) ≥ 1/2 − o(1) when X, X' are i.i.d. with high entropy. This paper provides the inequality H(X+X') − H(X) ≥ g(H(X)), where X, X' are arbitrary i.i.d. integer-valued random variables and where g is a universal strictly positive function on ℝ₊ satisfying g(0) = 0. Extensions to non-identically distributed random variables and to conditional entropies are also obtained.
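The entropy gap H(X+X') − H(X) in the abstract can be checked numerically for any concrete integer-valued distribution: the distribution of X + X' for i.i.d. X, X' is the self-convolution of the probability vector of X. The sketch below uses a hypothetical toy distribution (not taken from the paper) to illustrate that the gap is strictly positive, which is what the paper's universal function g lower-bounds.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector p."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A toy integer-valued distribution (hypothetical example, not from the paper).
p = np.array([0.5, 0.3, 0.2])   # P(X = 0), P(X = 1), P(X = 2)
p_sum = np.convolve(p, p)       # distribution of X + X' for i.i.d. X, X'

h_x = entropy(p)
h_sum = entropy(p_sum)
gap = h_sum - h_x               # the paper lower-bounds this gap by g(H(X)) > 0
print(h_x, h_sum, gap)
```

For this example H(X) ≈ 1.485 bits while H(X+X') ≈ 2.092 bits, so the gap is about 0.61 bits; the paper's contribution is that such a gap is bounded below by a universal positive function of H(X) alone, for every integer-valued distribution.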


Full work available at URL: https://arxiv.org/abs/1301.4185







Cited In (8)





