Information Theoretic Proofs of Entropy Power Inequalities
From MaRDI portal
Publication: 5281108
DOI: 10.1109/TIT.2010.2090193
zbMATH Open: 1366.94205
arXiv: 0704.1751
OpenAlex: W3105573210
MaRDI QID: Q5281108
FDO: Q5281108
Authors: Olivier Rioul
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Abstract: While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has up to now been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power.
Full work available at URL: https://arxiv.org/abs/0704.1751
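For reference, the classical statements behind this abstract, in one common normalization (the notation here is illustrative and not taken from the paper itself): for a random vector \(X\) in \(\mathbb{R}^n\) with a density and differential entropy \(h(X)\), the entropy power is
\[ N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, \]
and the EPI states that for independent \(X\) and \(Y\),
\[ N(X+Y) \ge N(X) + N(Y). \]
De Bruijn's identity, which underlies the Fisher information and MMSE representations mentioned above, says that for \(Z\) standard Gaussian and independent of \(X\),
\[ \frac{d}{dt}\, h\big(X + \sqrt{t}\, Z\big) = \frac{1}{2}\, J\big(X + \sqrt{t}\, Z\big), \]
where \(J\) denotes Fisher information; integrating such identities over \(t\) is the "integration over a path of a continuous Gaussian perturbation" referred to in the abstract.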
Cited In (20)
- A simple proof of the entropy-power inequality
- Clock synchronization and estimation in highly dynamic networks: an information theoretic approach
- A primer on alpha-information theory with application to leakage in secrecy systems
- A concavity property for the reciprocal of Fisher information and its consequences on Costa's EPI
- Asymptotic form of the Kullback-Leibler divergence for multivariate asymmetric heavy-tailed distributions
- The information-theoretic meaning of Gagliardo-Nirenberg type inequalities
- Entropy power inequalities for qudits
- On Shannon's formula and Hartley's rule: beyond the mathematical coincidence
- On the smoothed minimum error entropy criterion
- On the power of random information
- Role of information theoretic uncertainty relations in quantum theory
- An information-theoretic proof of Hadamard's inequality (Corresp.)
- Conditional quantum entropy power inequality for d-level quantum systems
- Weighted \(p\)-Rényi entropy power inequality: information theory to quantum Shannon theory
- Quantum Rényi entropy functionals for bosonic Gaussian systems
- Weighted entropy: basic inequalities
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- An extension of entropy power inequality for dependent random variables
- A de Bruijn's identity for dependent random variables based on copula theory
- Information theoretic inequalities