A note on entropy estimation
From MaRDI portal
Publication:5380327
Abstract: We compare an entropy estimator $\hat H_z$ recently discussed in [10] with two estimators, $\hat H_1$ and $\hat H_2$, introduced in [6] and [7]. We prove the identity $\hat H_z \equiv \hat H_1$, which has not been taken into account in [10]. Then we prove that the statistical bias of $\hat H_1$ is less than the bias of the ordinary likelihood (plug-in) estimator of entropy. Finally, by numerical simulation we verify that in the most interesting regime, small samples and large event spaces, the estimator $\hat H_2$ has a significantly smaller statistical error than $\hat H_z$.
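The "ordinary likelihood estimator" the abstract refers to is the plug-in estimator, which substitutes empirical frequencies into the entropy formula and systematically underestimates entropy for small samples. A minimal sketch (not taken from the paper; the Miller-Madow correction shown is a standard textbook remedy, not one of the estimators compared in the abstract) illustrating this negative bias in the small-sample, large-alphabet regime:

```python
import math
import random
from collections import Counter

def plugin_entropy(counts):
    """Ordinary likelihood (plug-in) entropy estimate in nats."""
    n = sum(counts)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

def miller_madow(counts):
    """Standard Miller-Madow correction: plug-in + (K_observed - 1) / (2n)."""
    n = sum(counts)
    k = sum(1 for c in counts if c > 0)
    return plugin_entropy(counts) + (k - 1) / (2 * n)

# Small-sample, large-alphabet regime: average the plug-in estimate
# over many draws from a uniform distribution and compare it with the
# true entropy log(K); the plug-in average falls below log(K).
random.seed(0)
K, n, trials = 50, 30, 2000            # alphabet size, sample size, repetitions
true_H = math.log(K)                   # entropy of the uniform distribution
avg_plugin = sum(
    plugin_entropy(list(Counter(random.randrange(K) for _ in range(n)).values()))
    for _ in range(trials)
) / trials
print(true_H, avg_plugin)              # avg_plugin is noticeably below true_H
```

With only 30 samples from a 50-symbol alphabet, the plug-in estimate cannot even exceed log(30), so the downward bias is severe; this is exactly the regime in which the abstract reports the largest differences between estimators.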
Cites work
- scientific article; zbMATH DE number 4062892
- scientific article; zbMATH DE number 3568534
- scientific article; zbMATH DE number 3273551
- Bayesian entropy estimation for countable discrete distributions
- Bias analysis in entropy estimation
- Convergence properties of functional estimates for discrete distributions
- Entropy estimation in Turing's perspective
- Entropy estimation of symbol sequences
- Measurement of Diversity
- Prediction and Entropy of Printed English
Cited in (10)
- Entropy estimation in Turing's perspective
- scientific article; zbMATH DE number 3954044
- Estimating the entropy of binary time series: methodology, some theory and a simulation study
- Entropy estimates of small data sets
- An asymptotic lower bound for the entropy of discrete populations with application to the estimation of entropy for approximately uniform populations
- A note on entropy optimization
- Estimating entropy rate from censored symbolic time series: A test for time-irreversibility
- A proof of the estimation from below in Pesin's entropy formula
- Symbolic partition in chaotic maps
- Scaling behaviour of entropy estimates
This page was built for publication: A note on entropy estimation