On inequalities between mutual information and variation
From MaRDI portal
Recommendations
- Mutual information, variation, and Fano's inequality
- Mutual information of several random variables and its estimation via variation
- Generalization of a Pinsker problem
- Estimating Mutual Information Via Kolmogorov Distance
- On computation of information via variation and inequalities for the entropy function
Cited in (19)
- Proofs of conservation inequalities for Levin's notion of mutual information of 1974
- The final form of Tao's inequality relating conditional expectation and conditional mutual information
- On one extremal problem for mutual information
- Mutual information of several random variables and its estimation via variation
- Correlation distance and bounds for mutual information
- Generalization of a Pinsker problem
- Scientific article (no title available; zbMATH DE number 5017031)
- On computation of information via variation and inequalities for the entropy function
- On some extremal problems for mutual information and entropy
- Some relations between mutual information and estimation error in Wiener space
- The role of mutual information in variational classifiers
- Dictator functions maximize mutual information
- Mutual information and the \(F\)-theorem
- Mutual information, variation, and Fano's inequality
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- Remarks on reverse Pinsker inequalities
- Scientific article (no title available; zbMATH DE number 4191548)
- Refinements of Pinsker's inequality
- Mutual Information Bounds via Adjacency Events