Nearest-neighbor entropy estimators with weak metrics
From MaRDI portal
Publication:476332
DOI: 10.3934/AMC.2014.8.119 · zbMATH Open: 1358.94047 · arXiv: 1205.5856 · OpenAlex: W2962863878 · MaRDI QID: Q476332 · FDO: Q476332
Authors: J. Herrera, Sumit K. Garg
Publication date: 28 November 2014
Published in: Advances in Mathematics of Communications
Abstract: The problem of improving the accuracy of nonparametric entropy estimation for a stationary ergodic process is considered. New weak metrics are introduced, and the relations between metrics, measures, and entropy are discussed. Based on these weak metrics, a new nearest-neighbor entropy estimator is constructed; it has a parameter with which the estimator can be tuned to reduce its bias. It is shown that the estimator's variance is upper-bounded by a nearly optimal Cramér-Rao lower bound.
Full work available at URL: https://arxiv.org/abs/1205.5856
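The paper's weak-metric construction is not reproduced here, but the classical nearest-neighbor (Kozachenko-Leonenko) estimator it builds on can be sketched as follows. This is a minimal illustration for 1-D samples under the ordinary Euclidean distance; the paper's contribution is, roughly, to replace that distance with a parameterized weak metric and tune the parameter to reduce bias.

```python
import math

def nn_entropy(samples):
    """Classical 1-D Kozachenko-Leonenko nearest-neighbor differential
    entropy estimate, in nats.

    H_hat = (1/n) * sum_i log(2 * rho_i) + log(n - 1) + gamma,
    where rho_i is the distance from sample i to its nearest neighbor
    and gamma is the Euler-Mascheroni constant. The weak-metric
    estimator of the paper would swap the |x - y| distance below for a
    tunable weak metric; this sketch keeps the standard metric.
    """
    n = len(samples)
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = 0.0
    for i, x in enumerate(samples):
        # Distance to the nearest neighbor of x among the other samples.
        rho = min(abs(x - y) for j, y in enumerate(samples) if j != i)
        total += math.log(2.0 * rho)  # 2*rho = length of the 1-D ball
    return total / n + math.log(n - 1) + gamma
```

For samples drawn uniformly from [0, 1] the true differential entropy is 0 nats, and the estimate converges to that value as the sample size grows; distinct sample values are assumed, since a zero nearest-neighbor distance makes the logarithm diverge.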
Recommendations
- Selection of a metric for the nearest neighbor entropy estimators
- Bias reduction via linear combination of nearest neighbour entropy estimators
- Bias of a nonparametric entropy estimator for Markov measures
- Bias reduction of the nearest neighbor entropy estimator
- Entropy estimators with almost sure convergence and an \(\mathcal O(n^{-1})\) variance
MSC classification: Nonparametric estimation (62G05); Measures of information, entropy (94A17); Stationary stochastic processes (60G10)
Cites Work
- Title not available
- Compression of individual sequences via variable-rate coding
- A diffusion limit for a class of randomly-growing binary trees
- Encyclopedia of Distances
- Bias of a nonparametric entropy estimator for Markov measures
- Statistical estimation of measure invariants
- Title not available
- Entropy estimators with almost sure convergence and an \(\mathcal O(n^{-1})\) variance
Cited In (7)
- Nearest neighbor estimates of Kaniadakis entropy
- Bias reduction of the nearest neighbor entropy estimator
- Bias reduction via linear combination of nearest neighbour entropy estimators
- Selection of a metric for the nearest neighbor entropy estimators
- Entropy estimators with almost sure convergence and an \(\mathcal O(n^{-1})\) variance
- A metric entropy bound is not sufficient for learnability
- Relaxation labelling and the entropy of neighbourhood information