Bias analysis in entropy estimation
From MaRDI portal
Publication:3159015
Abstract: We consider the problem of finite-sample corrections for entropy estimation. New estimates of the Shannon entropy are proposed and their systematic error (the bias) is computed analytically. We find that our results cover correction formulas for current entropy estimates recently discussed in the literature. The trade-off between bias reduction and the increase of the corresponding statistical error is analyzed.
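The bias the abstract refers to can be illustrated with the classical plug-in (maximum-likelihood) estimator and its leading-order correction, the Miller-Madow term (m − 1)/(2N), where m is the number of occupied bins and N the sample size. This is a generic sketch of that well-known correction, not the specific estimators proposed in the publication:

```python
import math

def plugin_entropy(counts):
    """Naive (maximum-likelihood) Shannon entropy in nats from bin counts.
    This estimator is biased downward for finite samples."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def miller_madow_entropy(counts):
    """Plug-in estimate plus the leading-order (m - 1)/(2N) bias correction,
    where m is the number of occupied bins and N is the sample size."""
    n = sum(counts)
    m = sum(1 for c in counts if c > 0)  # number of observed symbols
    return plugin_entropy(counts) + (m - 1) / (2 * n)

# Example: a uniform sample over 4 symbols, 5 observations each
counts = [5, 5, 5, 5]
print(plugin_entropy(counts))        # log 4, about 1.3863 nats
print(miller_madow_entropy(counts))  # adds 3/40 = 0.075 nats
```

Reducing the bias this way typically increases the variance of the estimate, which is the trade-off the abstract analyzes.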
Recommendations
- Estimation bias in maximum entropy models
- Bias adjustment for a nonparametric entropy estimator
- Bias reduction of the nearest neighbor entropy estimator
- Bias of a nonparametric entropy estimator for Markov measures
- Investigation on the high-order approximation of the entropy bias
- Analysis of entropy measures
- On the estimation of entropy
- On the entropy estimators
Cited in (16)
- Entropy estimation in Turing's perspective
- A note on entropy estimation
- Golden-Hessian structures
- Estimation of Entropy and Mutual Information
- Estimation bias in maximum entropy models
- Nonparametric estimation of quantile-based entropy function
- Finite sample effects in sequence analysis
- Bias reduction of the nearest neighbor entropy estimator
- Bias adjustment for a nonparametric entropy estimator
- Mathematical characterization of private and public immune receptor sequences
- Investigation on the high-order approximation of the entropy bias
- A mutual information estimator with exponentially decaying bias
- Entropy factor for randomness quantification in neuronal data
- An improved estimator of Shannon entropy with applications to systems with memory
- Symbolic partition in chaotic maps
- Measuring synchronization in coupled model systems: a comparison of different approaches
This page was built for publication: Bias analysis in entropy estimation