Information Inequalities for Joint Distributions, With Interpretations and Applications

DOI: 10.1109/TIT.2010.2046253
zbMATH Open: 1366.94198
arXiv: 0901.0044
MaRDI QID: Q5281447


Authors: Mokshay Madiman, Prasad Tetali


Publication date: 27 July 2017

Published in: IEEE Transactions on Information Theory

Abstract: Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as new determinantal inequalities in matrix theory. A new inequality for relative entropies is also developed, along with interpretations in terms of hypothesis testing. Finally, revealing connections of the results to the literature in economics, computer science, and physics are explored.
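The upper bound described in the abstract specializes to Shearer's lemma: if a family of subsets covers every index at least t times, then t times the joint entropy is at most the sum of the corresponding subset entropies. The following is a minimal Python sketch, not taken from the paper; the joint distribution and the covering family are hypothetical choices made only to check the inequality numerically for three binary variables.

```python
import math

# Joint pmf of three binary variables X1, X2, X3: a made-up example
# distribution, used only to illustrate the inequality numerically.
pmf = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.05, (0, 1, 0): 0.10, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.15, (1, 1, 0): 0.10, (1, 1, 1): 0.20,
}

def entropy(dist):
    """Shannon entropy (in bits) of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint_pmf, coords):
    """Marginal pmf of the variables indexed by coords (0-based)."""
    out = {}
    for outcome, p in joint_pmf.items():
        key = tuple(outcome[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

# Shearer's lemma with the cover {1,2}, {1,3}, {2,3}: every index appears in
# exactly two subsets, so 2 * H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3).
cover = [(0, 1), (0, 2), (1, 2)]
joint_entropy = entropy(pmf)
subset_entropy_sum = sum(entropy(marginal(pmf, s)) for s in cover)

print(f"2 * H(X1,X2,X3)           = {2 * joint_entropy:.4f} bits")
print(f"sum of pairwise entropies = {subset_entropy_sum:.4f} bits")
assert 2 * joint_entropy <= subset_entropy_sum + 1e-9
```

Here each index is covered exactly twice, so the bound reads 2 * H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3); the abstract's upper bound extends this to an arbitrary collection of subsets via fractional covering weights.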


Full work available at URL: https://arxiv.org/abs/0901.0044

Cited In (33)