\(k_n\)-nearest neighbor estimators of entropy
DOI: 10.3103/S106653070803006X
zbMATH Open: 1231.62047
OpenAlex: W2022259473
Wikidata: Q24855763 (Scholia: Q24855763)
MaRDI QID: Q734546
FDO: Q734546
Shengqiao Li, Robert Mnatsakanov, Neeraj Misra, E. James Harner
Publication date: 13 October 2009
Published in: Mathematical Methods of Statistics
Full work available at URL: https://doi.org/10.3103/s106653070803006x
Recommendations
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Bias reduction of the nearest neighbor entropy estimator
- A class of Rényi information estimators for multidimensional densities
- \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions
- Estimation of entropies and divergences via nearest neighbors
MSC classification
- Statistical aspects of information-theoretic topics (62B10)
- Nonparametric estimation (62G05)
- Asymptotic properties of nonparametric inference (62G20)
- Multivariate distribution of statistics (62H10)
- Measures of information, entropy (94A17)
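For context, estimators of this kind build on the classical nearest-neighbor entropy estimator of Kozachenko and Leonenko (cited below as "Sample estimate of the entropy of a random vector"). The following is a minimal illustrative sketch of that standard \(k\)-NN estimator, not the paper's \(k_n\)-nearest-neighbor construction; the function name and parameters are chosen here for illustration only.

```python
# Sketch of the classical Kozachenko-Leonenko k-NN entropy estimator:
#   H_hat = psi(n) - psi(k) + log(c_d) + (d / n) * sum_i log(r_i),
# where r_i is the distance from x_i to its k-th nearest neighbor and
# c_d is the volume of the d-dimensional unit ball. Illustrative only;
# it is not the k_n-NN estimator studied in the paper.
import math

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def knn_entropy(x, k=1):
    """Estimate differential entropy (in nats) from an (n, d) sample array."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k + 1 neighbors because each point is its own 0-distance neighbor.
    r = tree.query(x, k=k + 1)[0][:, k]
    # log volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_c_d = (d / 2.0) * math.log(math.pi) - math.lgamma(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(r))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.standard_normal((5000, 2))
    # True entropy of a bivariate standard normal: log(2*pi*e) ~= 2.84 nats.
    print(knn_entropy(sample, k=5))
```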
Cites Work
- Title not available
- Title not available
- Title not available
- Sample estimate of the entropy of a random vector
- Probabilistic model for two dependent circular variables
- Title not available
- Title not available
- Title not available
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- A Nonparametric Estimate of a Multivariate Density Function
- On the estimation of entropy
- Title not available
- Entropy-Based Tests of Uniformity
- A class of Rényi information estimators for multidimensional densities
- Estimation of entropy and other functionals of a multivariate density
- Title not available
- Title not available
- Properties of the statistical estimate of the entropy of a random vector with a probability density
Cited In (14)
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Estimation of multivariate Shannon entropy using moments
- Parametric Bayesian estimation of differential entropy and relative entropy
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- Information estimators for weighted observations
- A novel nonparametric distance estimator for densities with error bounds
- Bias reduction via linear combination of nearest neighbour entropy estimators
- Improvement of \(K_2\)-entropy calculations by means of dimension scaled distances
- Estimation of entropies and divergences via nearest neighbors
- Nearest neighbor estimates of entropy for multivariate circular distributions
- Functional sufficient dimension reduction through information maximization with application to classification
- \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions
- Relaxation labelling and the entropy of neighbourhood information
- Entropy expressions and their estimators for multivariate distributions