On inequalities between mutual information and variation
Publication: Q415703
DOI: 10.1134/S0032946007010024
zbMATH Open: 1237.94029
MaRDI QID: Q415703
Publication date: 9 May 2012
Published in: Problems of Information Transmission
Recommendations
- Mutual information, variation, and Fano's inequality
- Mutual information of several random variables and its estimation via variation
- Generalization of a Pinsker problem
- Estimating Mutual Information Via Kolmogorov Distance
- On computation of information via variation and inequalities for the entropy function
Cited In (15)
- Some relations between mutual information and estimation error in Wiener space
- The role of mutual information in variational classifiers
- Refinements of Pinsker's inequality
- Remarks on reverse Pinsker inequalities
- On computation of information via variation and inequalities for the entropy function
- Mutual information, variation, and Fano's inequality
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- Mutual information of several random variables and its estimation via variation
- On one extremal problem for mutual information
- Mutual information and the \(F\)-theorem
- On some extremal problems for mutual information and entropy
- Mutual Information Bounds via Adjacency Events
- Generalization of a Pinsker problem