\(k_n\)-nearest neighbor estimators of entropy
From MaRDI portal
Publication: 734546
DOI: 10.3103/S106653070803006X · zbMath: 1231.62047 · OpenAlex: W2022259473 · Wikidata: Q24855763 · Scholia: Q24855763 · MaRDI QID: Q734546
Robert M. Mnatsakanov, Shengqiao Li, Neeraj Misra, E. James Harner
Publication date: 13 October 2009
Published in: Mathematical Methods of Statistics
Full work available at URL: https://doi.org/10.3103/s106653070803006x
- Multivariate distribution of statistics (62H10)
- Asymptotic properties of nonparametric inference (62G20)
- Nonparametric estimation (62G05)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Information estimators for weighted observations
- Parametric Bayesian estimation of differential entropy and relative entropy
- Nearest neighbor estimates of entropy for multivariate circular distributions
- \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- ESTIMATION OF MULTIVARIATE SHANNON ENTROPY USING MOMENTS
- A novel nonparametric distance estimator for densities with error bounds
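The \(k_n\)-nearest neighbor estimators studied in this paper generalize the classical Kozachenko–Leonenko construction, where the differential entropy of a sample is estimated from each point's distance to its \(k\)-th nearest neighbor. A minimal stdlib-only sketch of that baseline (fixed \(k\), brute-force distance computation; function names are illustrative, not from the paper) is:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329


def digamma_int(m):
    # digamma at a positive integer m: psi(m) = H_{m-1} - gamma
    return sum(1.0 / i for i in range(1, m)) - EULER_GAMMA


def knn_entropy(points, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (nats).

    points: list of d-dimensional tuples, k: neighbor order.
    H_hat = psi(n) - psi(k) + log(c_d) + (d/n) * sum_i log(eps_i),
    where eps_i is the distance from point i to its k-th nearest
    neighbor and c_d is the volume of the d-dimensional unit ball.
    """
    n = len(points)
    d = len(points[0])
    # volume of the unit ball in R^d
    c_d = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    log_eps_sum = 0.0
    for i, x in enumerate(points):
        # brute-force O(n^2) neighbor search, fine for a sketch
        dists = sorted(math.dist(x, y) for j, y in enumerate(points) if j != i)
        log_eps_sum += math.log(dists[k - 1])
    return digamma_int(n) - digamma_int(k) + math.log(c_d) + d * log_eps_sum / n


# sanity check: uniform sample on [0,1]^2 has true entropy 0
random.seed(0)
sample = [(random.random(), random.random()) for _ in range(500)]
est = knn_entropy(sample, k=5)
```

With 500 uniform points on the unit square the estimate should land near the true value 0; the paper's contribution concerns the asymptotics when \(k_n\) is allowed to grow with the sample size, which this fixed-\(k\) sketch does not attempt to capture.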
Cites Work
- Unnamed Item
- Properties of the statistical estimate of the entropy of a random vector with a probability density
- Estimation of entropy and other functionals of a multivariate density
- A class of Rényi information estimators for multidimensional densities
- Sample estimate of the entropy of a random vector
- On the estimation of entropy
- Entropy-Based Tests of Uniformity
- Probabilistic model for two dependent circular variables
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- A Nonparametric Estimate of a Multivariate Density Function