Information Theoretic Proofs of Entropy Power Inequalities


DOI: 10.1109/TIT.2010.2090193
zbMATH Open: 1366.94205
arXiv: 0704.1751
OpenAlex: W3105573210
MaRDI QID: Q5281108
FDO: Q5281108


Author: Olivier Rioul


Publication date: 27 July 2017

Published in: IEEE Transactions on Information Theory

Abstract: While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power.
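For reference, the EPI and de Bruijn's identity mentioned in the abstract can be stated in their standard forms as follows (the notation N, h, and J is the usual one and is not taken from the paper itself):

\[
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
  \qquad
  N(X+Y) \;\ge\; N(X) + N(Y) \quad \text{for independent } X,\, Y,
\]
\[
  \frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right) \;=\; \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right),
\]

where X is an n-dimensional random vector with differential entropy h(X), N(X) is its entropy power, J denotes Fisher information, and Z is a standard Gaussian vector independent of X.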


Full work available at URL: https://arxiv.org/abs/0704.1751







Cited in: 20 documents




