Universal estimation of information measures for analog sources
Publication: 3589014
DOI: 10.1561/0100000021
zbMATH Open: 1194.94174
OpenAlex: W2095324495
MaRDI QID: Q3589014
Authors: Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú
Publication date: 10 September 2010
Published in: Foundations and Trends™ in Communications and Information Theory
Full work available at URL: https://doi.org/10.1561/0100000021
Cited In (5)
- Optimal rates of entropy estimation over Lipschitz balls
- Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence
- The interplay between information and estimation measures
- Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
- Universal Divergence Estimation for Finite-Alphabet Sources
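This monograph surveys universal estimators of entropy, mutual information, and divergence for continuous (analog) sources, including nearest-neighbor and data-dependent-partition methods of the kind studied in the citing works above. As a minimal illustrative sketch (not code from the monograph itself), the following implements the classical Kozachenko–Leonenko k-nearest-neighbor estimator of differential entropy; the function name, default k, and library choices are assumptions for this example.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy_estimate(samples, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate, in nats.

    samples: (n, d) array of i.i.d. draws from an unknown density on R^d.
    Assumes no duplicate points (the log of a zero distance is undefined).
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # Distance from each sample to its k-th nearest neighbor;
    # query k+1 neighbors because column 0 is the point itself (distance 0).
    eps, _ = cKDTree(x).query(x, k=k + 1)
    eps_k = eps[:, k]
    # Log-volume of the unit d-ball: pi^{d/2} / Gamma(d/2 + 1).
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # H_hat = psi(n) - psi(k) + log c_d + (d/n) * sum_i log eps_i
    return digamma(n) - digamma(k) + log_c_d + (d / n) * np.sum(np.log(eps_k))

# Quick check: for N(0, 1) in one dimension the true differential
# entropy is 0.5 * log(2 * pi * e) ~= 1.419 nats.
rng = np.random.default_rng(0)
print(kl_entropy_estimate(rng.standard_normal((5000, 1))))
```

The estimator is universal in the sense surveyed here: it requires no parametric model of the source density, only i.i.d. samples.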