A Probabilistic Upper Bound on Differential Entropy
From MaRDI portal
Publication:3604918
Abstract: A novel, non-trivial, probabilistic upper bound on the differential entropy of an unknown one-dimensional distribution is presented, given only the support of the distribution and a sample drawn from it. No knowledge beyond the support of the unknown distribution is required, nor is the distribution required to have a density. The bound is constructed from previous distribution-free bounds on the cumulative distribution function of a random variable given a sample of that variable. A simple, fast, and intuitive algorithm for computing the entropy bound from a sample is provided.
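The abstract's construction starts from a distribution-free confidence band on the CDF. The paper's exact choice of bound is not reproduced on this page; a standard example of such a band is the Dvoretzky–Kiefer–Wolfowitz (DKW) inequality, and the sketch below (an illustrative assumption, not the authors' algorithm) shows how one is computed from a sample:

```python
import math

def dkw_cdf_band(sample, alpha=0.05):
    """Distribution-free confidence band for the CDF of a sample,
    via the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality.

    Illustrative only: the paper may use a different CDF bound.
    With probability at least 1 - alpha, the true CDF lies between
    the returned lower and upper envelopes at every sample point.
    """
    xs = sorted(sample)
    n = len(xs)
    # DKW: P( sup_x |F_n(x) - F(x)| > eps ) <= 2 * exp(-2 * n * eps^2),
    # so eps = sqrt(ln(2/alpha) / (2n)) gives a level-(1-alpha) band.
    eps = math.sqrt(math.log(2.0 / alpha) / (2.0 * n))
    band = []
    for i, x in enumerate(xs, start=1):
        f_hat = i / n                   # empirical CDF at x
        lo = max(0.0, f_hat - eps)      # lower envelope, clipped to [0, 1]
        hi = min(1.0, f_hat + eps)      # upper envelope, clipped to [0, 1]
        band.append((x, lo, hi))
    return band, eps
```

Given such a band and a known support, the general recipe the abstract suggests is to bound the entropy by the most entropic distribution whose CDF stays inside the band; the specifics of that step are the paper's contribution.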
Recommendations
- An upper bound of the measure-theoretical entropy
- A refined upper bound for entropy
- Some upper bounds for relative entropy and applications
- A note on the upper bound for the difference between two entropies
- A tight upper bound on discrete entropy
- A new entropy upper bound
- Some bounds on entropy measures in information theory
- Bounds on general entropy measures
- Tight Bound on Relative Entropy by Entropy Difference