Information Inequalities for Joint Distributions, With Interpretations and Applications
From MaRDI portal
Publication:5281447
Abstract: Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige, and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as new determinantal inequalities in matrix theory. A new inequality for relative entropies is also developed, along with interpretations in terms of hypothesis testing. Finally, connections of the results to the literature in economics, computer science, and physics are explored.
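The subset entropy inequalities the abstract refers to can be checked numerically. As a minimal illustrative sketch (the joint distribution below is arbitrary, chosen only for the example), the following verifies Han's inequality, one of the results the paper generalizes: the joint entropy of \(n\) variables is at most \(\frac{1}{n-1}\) times the sum of the entropies of the \(n\) subsets obtained by dropping one variable.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, coords):
    """Marginal of a joint pmf {tuple: probability} onto the given coordinate indices."""
    out = {}
    for x, q in joint.items():
        key = tuple(x[i] for i in coords)
        out[key] = out.get(key, 0.0) + q
    return out

# Example: three correlated bits (an arbitrary joint pmf for illustration)
joint = {
    (0, 0, 0): 0.25, (0, 0, 1): 0.10, (0, 1, 0): 0.05, (0, 1, 1): 0.10,
    (1, 0, 0): 0.10, (1, 0, 1): 0.05, (1, 1, 0): 0.10, (1, 1, 1): 0.25,
}

n = 3
H_full = entropy(joint)

# Han's inequality: H(X_1,...,X_n) <= (1/(n-1)) * sum_i H(X_{[n] \ {i}})
han_bound = sum(
    entropy(marginal(joint, [j for j in range(n) if j != i])) for i in range(n)
) / (n - 1)

print(f"H(X1,X2,X3)     = {H_full:.4f} bits")
print(f"Han upper bound = {han_bound:.4f} bits")
assert H_full <= han_bound + 1e-12
```

The same helper functions extend to Shearer-type bounds, where the dropped-one-coordinate subsets are replaced by any cover of the index set in which each index appears in at least \(k\) subsets.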
Cited in (33):
- Concentration of measure, classification of submeasures, and dynamics of \(L_0\)
- scientific article (zbMATH DE number 5017031; no title available)
- scientific article (zbMATH DE number 5526490; no title available)
- Matchings and independent sets of a fixed size in regular graphs
- Block factorization of the relative entropy via spatial mixing
- The number of independent sets in an irregular graph
- The convexification effect of Minkowski summation
- Information theoretic approach to statistical properties of multivariate Cauchy-Lorentz distributions
- Maximizing \(H\)-colorings of a regular graph
- Mean mutual information and symmetry breaking for finite random fields
- Entropy inequalities for sums in prime cyclic groups
- Projections, entropy and sumsets
- Independent sets in graphs
- The Information Theory of Comparisons
- Entropy and set cardinality inequalities for partition-determined functions
- Entropy production in nonlinear recombination models
- Sumsets and entropy
- Entropy versions of additive inequalities
- Majorization and Rényi entropy inequalities via Sperner theory
- Mutual Information Bounds via Adjacency Events
- scientific article (zbMATH DE number 1765864; no title available)
- Inference Under Information Constraints II: Communication Constraints and Shared Randomness
- Multi-variate correlation and mixtures of product measures.
- Rigidity of proper colorings of \(\mathbb{Z}^d \)
- Modified information criteria for a uniform approximate equivalence of probability distributions
- An upper bound for the number of independent sets in regular graphs
- Approximate tensorization of entropy at high temperature
- Do Minkowski averages get progressively more convex?
- Submodular functions are noise stable
- Some inequalities for information divergence and related measures of discrimination
- Operational Interpretation of Rényi Information Measures via Composite Hypothesis Testing Against Product and Markov Distributions
- Volumes of subset Minkowski sums and the Lyusternik region
- On the volume of the Minkowski sum of zonoids