On one extreme value problem for entropy and error probability
From MaRDI portal
Publication:2263005
DOI: 10.1134/S003294601403016
zbMATH Open: 1321.94025
OpenAlex: W2041262889
MaRDI QID: Q2263005
FDO: Q2263005
Authors: V. V. Prelov
Publication date: 17 March 2015
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s003294601403016
Classification: Statistical aspects of information-theoretic topics (62B10); Measures of information, entropy (94A17)
Cites Work
- Estimating Mutual Information Via Kolmogorov Distance
- On the Interplay Between Conditional Entropy and Error Probability
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- Local Pinsker Inequalities via Stein's Discrete Density Approach
- The Interplay Between Entropy and Variational Distance
- On estimation of information via variation
- Mutual information, variation, and Fano's inequality
- Generalization of a Pinsker problem
Cited In (7)
- An Optimization Problem Related to the Zeta-function
- On joint and conditional entropies
- Title not available
- On some extremal problems for mutual information and entropy
- An entropy maximization problem related to optical communication (Corresp.)
- On extreme values of the Rényi entropy under coupling of probability distributions
- Title not available