Entropy Estimation in Turing's Perspective
From MaRDI portal
Publication: 2919410
DOI: 10.1162/NECO_A_00266 · zbMath: 1262.94015 · OpenAlex: W2141753389 · Wikidata: Q43429954 · Scholia: Q43429954 · MaRDI QID: Q2919410
Publication date: 2 October 2012
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00266
Measures of information, entropy (94A17) Statistical aspects of information-theoretic topics (62B10)
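For context on the record's subject: the "Turing's perspective" of the title refers to Turing's formula, which estimates the total probability of unobserved outcomes by N1/n, where N1 is the number of species seen exactly once in a sample of size n. The sketch below is an illustrative toy example of that formula alongside the naive plug-in entropy estimate (which the bias-analysis literature cited here addresses); it is not the estimator proposed in the publication itself, and the function names are hypothetical.

```python
from collections import Counter
from math import log

def turing_missing_mass(sample):
    # Turing's formula: the probability mass of unseen outcomes is
    # estimated by N1/n, with N1 = number of outcomes observed exactly once.
    counts = Counter(sample)
    n = len(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / n

def plugin_entropy(sample):
    # Naive plug-in (maximum-likelihood) entropy estimate in nats;
    # known to be negatively biased for small samples.
    counts = Counter(sample)
    n = len(sample)
    return -sum((c / n) * log(c / n) for c in counts.values())

sample = list("abracadabra")       # counts: a=5, b=2, r=2, c=1, d=1
print(turing_missing_mass(sample)) # 2/11 (two singletons: c and d)
print(plugin_entropy(sample))
```

Here the two singletons `c` and `d` drive the missing-mass estimate, mirroring the intuition that rare observed species signal unobserved ones.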
Related Items (7)
A multivariate normal law for Turing's formulae ⋮ On Sub-Gaussian Concentration of Missing Mass ⋮ A mutual information estimator with exponentially decaying bias ⋮ A Note on Entropy Estimation ⋮ Nonparametric Estimation of Kullback-Leibler Divergence ⋮ Bias adjustment for a nonparametric entropy estimator ⋮ Entropic representation and estimation of diversity indices
Cites Work
- Asymptotic normality of a nonparametric estimator of sample coverage
- Re-parameterization of multinomial distributions and diversity indices
- A normal limit law for a nonparametric estimator of the coverage of a random sample
- Convergence properties of functional estimates for discrete distributions
- Bias analysis in entropy estimation
- A sufficient normality condition for Turing's formula
- Estimation of Entropy and Mutual Information
- Estimating the Total Probability of the Unobserved Outcomes of an Experiment
- Measurement of Diversity
- A Class of Statistics with Asymptotically Normal Distribution
- The Population Frequencies of Species and the Estimation of Population Parameters