A Normal Law for the Plug-in Estimator of Entropy
From MaRDI portal
Publication:5271944
DOI: 10.1109/TIT.2011.2179702 ⋮ zbMath: 1365.62124 ⋮ OpenAlex: W1985747475 ⋮ MaRDI QID: Q5271944
Publication date: 12 July 2017
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2011.2179702
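The publication's subject, the plug-in estimator of Shannon entropy, substitutes empirical frequencies into the entropy formula. A minimal sketch (not the paper's own code; function name and example data are illustrative):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (empirical) estimator of Shannon entropy in nats:
    substitute the observed frequencies p_hat into H(p) = -sum p log p."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A balanced two-symbol sample yields exactly log(2) ≈ 0.6931 nats
print(plugin_entropy(["H", "T", "H", "T"]))
```

The paper establishes asymptotic normality for this estimator, so after centering and scaling, the estimate above satisfies a central limit theorem as the sample size grows.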
Mathematics Subject Classification: Asymptotic properties of nonparametric inference (62G20) ⋮ Nonparametric estimation (62G05) ⋮ Measures of information, entropy (94A17) ⋮ Coding theorems (Shannon theory) (94A24)
Related Items (5)
A quantum-mechanical derivation of the multivariate central limit theorem for Markov chains ⋮ Limit theorems for empirical Rényi entropy and divergence with applications to molecular diversity analysis ⋮ A mutual information estimator with exponentially decaying bias ⋮ Asymptotic normality for plug-in estimators of diversity indices on countable alphabets ⋮ Nonparametric Estimation of Kullback-Leibler Divergence