Mutual Information Matrices Are Not Always Positive Semidefinite
Publication: 2986381 (MaRDI QID: Q2986381)
DOI: 10.1109/TIT.2014.2311434
zbMATH Open: 1360.62027
arXiv: 1307.6673
OpenAlex: W1975323904
Publication date: 16 May 2017
Published in: IEEE Transactions on Information Theory
Abstract: For discrete random variables X_1, ..., X_n we construct an n by n matrix whose (i, j) entry is the mutual information I(X_i; X_j) between X_i and X_j. In particular, the (i, i) entry is the entropy H(X_i) = I(X_i; X_i) of X_i. This matrix, called the mutual information matrix of (X_1, ..., X_n), has been conjectured to be positive semidefinite. In this note, we give counterexamples to the conjecture, and show that the conjecture holds for up to three random variables.
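The construction in the abstract can be illustrated with a short sketch. The code below is not from the paper (the note gives analytic counterexamples, not code); it is a minimal, assumed implementation that builds the mutual information matrix from an explicit joint distribution, with the function name `mutual_info_matrix` chosen here for illustration.

```python
import math

def mutual_info_matrix(joint):
    """Mutual information matrix of (X_1, ..., X_n).

    joint: dict mapping outcome tuples (x_1, ..., x_n) to probabilities.
    Returns the n x n matrix M with M[i][j] = I(X_i; X_j); the diagonal
    entries equal the entropies H(X_i) = I(X_i; X_i), as in the abstract.
    """
    n = len(next(iter(joint)))

    def marginal(idx):
        # Marginal distribution of the variables at positions idx.
        m = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in idx)
            m[key] = m.get(key, 0.0) + p
        return m

    M = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            pi, pj = marginal((i,)), marginal((j,))
            pij = marginal((i, j))
            # I(X_i; X_j) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
            M[i][j] = sum(
                p * math.log2(p / (pi[(x,)] * pj[(y,)]))
                for (x, y), p in pij.items() if p > 0
            )
    return M

# Two independent fair bits: off-diagonal mutual information is 0,
# diagonal entries are the entropies H(X_i) = 1 bit.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_info_matrix(joint))  # → [[1.0, 0.0], [0.0, 1.0]]
```

Note that the diagonal falls out automatically: for i = j the "pair" marginal is supported on (x, x) with p(x, x) = p(x), and p / (p * p) = 1 / p, so the sum reduces to H(X_i). For small matrices like these, positive semidefiniteness can then be checked numerically via the eigenvalues of M; the paper's point is that for four or more variables this check can fail.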
Full work available at URL: https://arxiv.org/abs/1307.6673
MSC classifications:
- Statistical aspects of information-theoretic topics (62B10)
- Measures of information, entropy (94A17)
Cited In (2)