Information Inequalities for Joint Distributions, With Interpretations and Applications
Publication: Q5281447
DOI: 10.1109/TIT.2010.2046253
zbMATH Open: 1366.94198
arXiv: 0901.0044
MaRDI QID: Q5281447
Authors: Mokshay Madiman, Prasad Tetali
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Abstract: Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as new determinantal inequalities in matrix theory. A new inequality for relative entropies is also developed, along with interpretations in terms of hypothesis testing. Finally, revealing connections of the results to the literature in economics, computer science, and physics are explored.
Full work available at URL: https://arxiv.org/abs/0901.0044
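As orientation for the abstract, the following is a brief sketch of the inequalities it refers to, paraphrased from standard statements rather than quoted from the paper. Shearer's lemma states that if \(\mathcal{C}\) is a collection of subsets of \([n] = \{1, \dots, n\}\) such that every index \(i\) lies in at least \(t\) members of \(\mathcal{C}\), then
\[ t\, H(X_1, \dots, X_n) \le \sum_{S \in \mathcal{C}} H(X_S), \qquad X_S := (X_i : i \in S). \]
Han's inequality is the special case where \(\mathcal{C}\) consists of all \((n-1)\)-element subsets, so that \(t = n-1\):
\[ H(X_1, \dots, X_n) \le \frac{1}{n-1} \sum_{i=1}^{n} H\big(X_1, \dots, X_{i-1}, X_{i+1}, \dots, X_n\big). \]
The paper's upper bound replaces the uniform cover by an arbitrary fractional cover, i.e. weights \(\alpha_S \ge 0\) with \(\sum_{S \ni i} \alpha_S \ge 1\) for every \(i\), giving \(H(X_1, \dots, X_n) \le \sum_{S} \alpha_S H(X_S)\); the dual lower bound is, roughly, of the same shape but uses fractional packings together with conditional entropies \(H(X_S \mid X_{S^c})\). This paraphrase may differ in detail from the precise statements; see the arXiv version linked above.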
Cited In (33)
- Concentration of measure, classification of submeasures, and dynamics of \(L_0\)
- Title not available
- Title not available
- Block factorization of the relative entropy via spatial mixing
- The number of independent sets in an irregular graph
- Matchings and independent sets of a fixed size in regular graphs
- The convexification effect of Minkowski summation
- Information theoretic approach to statistical properties of multivariate Cauchy-Lorentz distributions
- Maximizing \(H\)-colorings of a regular graph
- Mean mutual information and symmetry breaking for finite random fields
- Entropy inequalities for sums in prime cyclic groups
- Projections, entropy and sumsets
- Independent sets in graphs
- The Information Theory of Comparisons
- Entropy and set cardinality inequalities for partition-determined functions
- Entropy production in nonlinear recombination models
- Sumsets and entropy
- Entropy versions of additive inequalities
- Majorization and Rényi entropy inequalities via Sperner theory
- Mutual Information Bounds via Adjacency Events
- Title not available
- Inference Under Information Constraints II: Communication Constraints and Shared Randomness
- Multi-variate correlation and mixtures of product measures.
- Rigidity of proper colorings of \(\mathbb{Z}^d \)
- Modified information criteria for a uniform approximate equivalence of probability distributions
- Approximate tensorization of entropy at high temperature
- An upper bound for the number of independent sets in regular graphs
- Submodular functions are noise stable
- Do Minkowski averages get progressively more convex?
- Some inequalities for information divergence and related measures of discrimination
- Volumes of subset Minkowski sums and the Lyusternik region
- Operational Interpretation of Rényi Information Measures via Composite Hypothesis Testing Against Product and Markov Distributions
- On the volume of the Minkowski sum of zonoids