On some extremal problems for mutual information and entropy
Publication: Q2364451
DOI: 10.1134/S0032946016040013
zbMATH Open: 1368.62013
MaRDI QID: Q2364451
FDO: Q2364451
Author: V. V. Prelov
Publication date: 21 July 2017
Published in: Problems of Information Transmission
Mathematics Subject Classification
- Statistical aspects of information-theoretic topics (62B10)
- Information theory (general) (94A15)
- Measures of information, entropy (94A17)
Cites Work
- Estimating Mutual Information Via Kolmogorov Distance
- On the Interplay Between Conditional Entropy and Error Probability
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- Title not available
- The Interplay Between Entropy and Variational Distance
- On inequalities between mutual information and variation
- Mutual information, variation, and Fano's inequality
- On one extreme value problem for entropy and error probability
- Title not available
Cited In (12)
- Title not available
- Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
- On computation of information via variation and inequalities for the entropy function
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- On Exact and ∞-Rényi Common Informations
- On one extremal problem for mutual information
- Mutual information and the F-theorem
- On the entropy of couplings
- Title not available
- Mutual Information Bounds via Adjacency Events
- Mutual dependence of random variables and maximum discretized entropy
- On one extreme value problem for entropy and error probability