Estimating Entropy on $m$ Bins Given Fewer Than $m$ Samples
From MaRDI portal
Publication: 3547465
Recommendations
- Existence of an unbiased consistent entropy estimator for the special Bernoulli measure
- On entropy estimation for distributions with countable support.
- Approximating entropy from sublinear samples
- Density-free convergence properties of various estimators of entropy
- Entropy estimation in Turing's perspective
Cited in (17)
- Feature extraction from spike trains with Bayesian binning: 'Latency is where the signal starts'
- Structural changes in large economic datasets: a nonparametric homogeneity test
- Optimal rates of entropy estimation over Lipschitz balls
- Testing probability distributions using conditional samples
- Quantifying Stimulus Discriminability: A Comparison of Information Theory and Ideal Observer Analysis
- Existence of an unbiased consistent entropy estimator for the special Bernoulli measure
- Sublinear algorithms for approximating string compressibility
- A Bernstein-von Mises theorem for discrete probability distributions
- Proofs of proximity for distribution testing
- Fluctuations of the Empirical Entropies of a Chain of Infinite Order
- Investigation on the high-order approximation of the entropy bias
- An automatic inequality prover and instance optimal identity testing
- Recovering structured probability matrices
- Quadratic Tsallis entropy bias and generalized maximum entropy models
- On entropy estimation for distributions with countable support.
- Indices for Testing Neural Codes
- Sample complexity of the distinct elements problem