Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
Publication: Q2986198
DOI: 10.1109/TIT.2014.2322861 · zbMATH Open: 1360.94159 · arXiv: 1206.0489 · MaRDI QID: Q2986198 · FDO: Q2986198
Authors: Ioannis Kontoyiannis, Mokshay Madiman
Publication date: 16 May 2017
Published in: IEEE Transactions on Information Theory
Abstract: The sumset and inverse sumset theories of Freiman, Plünnecke and Ruzsa give bounds connecting the cardinality of the sumset of two discrete sets A and B to the cardinalities (or the finer structure) of the original sets A, B. For example, the sum-difference bound of Ruzsa states that |A + B| ≤ |A − B|³ / (|A| |B|), where A + B = {a + b : a ∈ A, b ∈ B} and the difference set A − B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) of a continuous random variable X as (the logarithm of) the size of the effective support of X, the main contribution of this paper is a series of natural information-theoretic analogs for these results. For example, the Ruzsa sum-difference bound becomes the new inequality h(X + Y) ≤ 3h(X − Y) − h(X) − h(Y), for any pair of independent continuous random variables X and Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. We also give a differential-entropy version of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy H were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H. Since differential entropy is *not* functionally submodular, in the continuous case many of the corresponding discrete proofs fail, in many cases requiring substantially new proof strategies. We find that the basic property that naturally replaces the discrete functional submodularity is the data processing property of mutual information.
Full work available at URL: https://arxiv.org/abs/1206.0489
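The entropy sum-difference inequality quoted in the abstract, h(X + Y) ≤ 3h(X − Y) − h(X) − h(Y), can be sanity-checked in closed form for independent Gaussians, where h(N(0, σ²)) = ½ log(2πeσ²) and both X + Y and X − Y are N(0, σ₁² + σ₂²). The following is an illustrative sketch only (function names and test variances are our own, not from the paper):

```python
import math

def gaussian_entropy(var):
    """Differential entropy (in nats) of a N(0, var) random variable."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# X ~ N(0, v1) and Y ~ N(0, v2) independent, so X+Y and X-Y are both
# N(0, v1 + v2).  Check h(X+Y) <= 3 h(X-Y) - h(X) - h(Y) on a few cases.
for v1, v2 in [(1.0, 1.0), (1.0, 4.0), (0.25, 9.0)]:
    h_sum = gaussian_entropy(v1 + v2)   # h(X+Y)
    h_diff = gaussian_entropy(v1 + v2)  # h(X-Y), same law as X+Y here
    rhs = 3 * h_diff - gaussian_entropy(v1) - gaussian_entropy(v2)
    assert h_sum <= rhs + 1e-12, (v1, v2)
```

For Gaussians the bound reduces to 2h(X + Y) ≥ h(X) + h(Y), i.e. (σ₁² + σ₂²)² ≥ σ₁²σ₂², which always holds; the paper's content is that the inequality holds for arbitrary independent continuous random variables.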
Recommendations
- Sumset and Inverse Sumset Theory for Shannon Entropy
- scientific article; zbMATH DE number 1876969
- Sumsets and entropy
- Equivalence of Additive-Combinatorial Linear Inequalities for Shannon Entropy and Differential Entropy
- scientific article; zbMATH DE number 5017031
- On some extremal problems for mutual information and entropy
- On solution sets of information inequalities
- Some inequalities in information theory using Tsallis entropy
- On inequalities between mutual information and variation
- scientific article; zbMATH DE number 1121120
Cited In (12)
- An information-theoretic approach to study spatial dependencies in small datasets
- On the dimension of Bernoulli convolutions
- Absolute continuity of Bernoulli convolutions for algebraic parameters
- Entropy of Bernoulli convolutions and uniform exponential growth for linear groups
- Absolutely continuous self-similar measures with exponential separation
- Two Remarks on Generalized Entropy Power Inequalities
- Entropy inequalities for sums in prime cyclic groups
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Entropy versions of additive inequalities
- Majorization and Rényi entropy inequalities via Sperner theory
- Stability of the Shannon-Stam inequality via the Föllmer process
- Sumset and Inverse Sumset Theory for Shannon Entropy