Nonparametric Estimation of Kullback-Leibler Divergence
From MaRDI portal
Publication:5383802
DOI: 10.1162/NECO_a_00646
zbMath: 1419.62068
DBLP: journals/neco/ZhangG14
OpenAlex: W2021531548
Wikidata: Q51064641
Scholia: Q51064641
MaRDI QID: Q5383802
Publication date: 20 June 2019
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00646
Mathematics Subject Classification:
- Asymptotic properties of nonparametric inference (62G20)
- Nonparametric estimation (62G05)
- Measures of information, entropy (94A17)
Related Items (1)
Cites Work
- A Mathematical Theory of Communication
- Bias adjustment for a nonparametric entropy estimator
- Re-parameterization of multinomial distributions and diversity indices
- Entropy Estimation in Turing's Perspective
- Asymptotic Normality of an Entropy Estimator With Exponentially Decaying Bias
- Estimation of Entropy and Mutual Information
- Divergence Estimation for Multidimensional Densities Via $k$-Nearest-Neighbor Distances
- A Normal Law for the Plug-in Estimator of Entropy
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Elements of Information Theory
- On Information and Sufficiency
This page was built for publication: Nonparametric Estimation of Kullback-Leibler Divergence
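As context for the record's topic, the sketch below illustrates one standard nonparametric approach to estimating Kullback-Leibler divergence from samples: the k-nearest-neighbor estimator in the style of the cited "Divergence Estimation for Multidimensional Densities Via $k$-Nearest-Neighbor Distances" (shown here in one dimension for simplicity). This is an illustrative sketch of that general technique, not the specific estimator proposed in the publication above; the function name and parameters are this sketch's own.

```python
import numpy as np


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(P || Q) from 1-D samples x ~ P and y ~ Q.

    Uses the nearest-neighbor construction: for each x_i, compare its
    distance to the k-th nearest neighbor among the other x's (rho_i)
    with its distance to the k-th nearest neighbor among the y's (nu_i).
    For dimension d = 1 the estimate is
        mean(log(nu_i / rho_i)) + log(m / (n - 1)).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, m = len(x), len(y)

    # rho_i: distance from x_i to its k-th nearest neighbor in x \ {x_i}
    dx = np.abs(x[:, None] - x[None, :])
    np.fill_diagonal(dx, np.inf)  # exclude the point itself
    rho = np.sort(dx, axis=1)[:, k - 1]

    # nu_i: distance from x_i to its k-th nearest neighbor in y
    dy = np.abs(x[:, None] - y[None, :])
    nu = np.sort(dy, axis=1)[:, k - 1]

    return float(np.mean(np.log(nu / rho)) + np.log(m / (n - 1)))
```

With samples from N(0, 1) and N(1, 1), for which the true divergence is 0.5, the estimate converges to a neighborhood of that value as the sample sizes grow; larger k trades variance for bias.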