Mutual information, variation, and Fano's inequality
Publication: 734252
DOI: 10.1134/S0032946008030022
zbMATH Open: 1173.94401
OpenAlex: W2029005240
MaRDI QID: Q734252
FDO: Q734252
Authors: V. V. Prelov, Edward C. van der Meulen
Publication date: 20 October 2009
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s0032946008030022
Cited in (10):
- On inequalities between mutual information and variation
- On computation of information via variation and inequalities for the entropy function
- Generalizing the Fano inequality
- Mutual information of several random variables and its estimation via variation
- Mutual information and the F-theorem
- On some extremal problems for mutual information and entropy
- Mutual Information Bounds via Adjacency Events
- Generalization of a Pinsker problem
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- On one extreme value problem for entropy and error probability