On one extreme value problem for entropy and error probability
Publication: 2263005
DOI: 10.1134/S003294601403016 · zbMATH Open: 1321.94025 · OpenAlex: W2041262889 · MaRDI QID: Q2263005 · FDO: Q2263005
Publication date: 17 March 2015
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s003294601403016
Mathematics Subject Classification: Statistical aspects of information-theoretic topics (62B10); Measures of information, entropy (94A17)
Cites Work
- Estimating Mutual Information Via Kolmogorov Distance
- On the Interplay Between Conditional Entropy and Error Probability
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- Local Pinsker Inequalities via Stein's Discrete Density Approach
- The Interplay Between Entropy and Variational Distance
- On estimation of information via variation
- Mutual information, variation, and Fano's inequality
- Generalization of a Pinsker problem
Cited In (4)