A Probabilistic Upper Bound on Differential Entropy
DOI: 10.1109/TIT.2008.929937
zbMATH Open: 1318.62014
arXiv: cs/0504091
MaRDI QID: Q3604918
Authors: Erik G. Learned-Miller, Joseph J. DeStefano
Publication date: 24 February 2009
Published in: IEEE Transactions on Information Theory
Abstract: A novel, non-trivial, probabilistic upper bound on the entropy of an unknown one-dimensional distribution, given the support of the distribution and a sample from that distribution, is presented. No knowledge beyond the support of the unknown distribution is required, nor is the distribution required to have a density. Previous distribution-free bounds on the cumulative distribution function of a random variable given a sample of that variable are used to construct the bound. A simple, fast, and intuitive algorithm for computing the entropy bound from a sample is provided.
Full work available at URL: https://arxiv.org/abs/cs/0504091
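The abstract describes the construction only in outline, so a minimal sketch of the idea follows; it is not the authors' implementation. The sketch assumes a Dvoretzky-Kiefer-Wolfowitz (DKW) confidence band of half-width eps = sqrt(ln(2/delta)/(2n)) around the empirical CDF, pins the CDF to 0 and 1 at the known support endpoints, and uses the fact that, among all distributions whose CDF passes through fixed values at the knots, entropy is maximized by a piecewise-uniform density. The maximization over the band is done here with a generic concave-program solver (SciPy's SLSQP) in place of the paper's own algorithm; the function name `entropy_upper_bound` and all parameter choices are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def entropy_upper_bound(sample, a, b, delta=0.05):
    """Sketch of a probabilistic upper bound on differential entropy (nats).

    Assumptions (hedged, not taken from the paper's text):
      * DKW band of half-width eps = sqrt(ln(2/delta)/(2n)) around the
        empirical CDF holds with probability >= 1 - delta;
      * the support [a, b] is known, so F(a) = 0 and F(b) = 1;
      * the max-entropy CDF inside the band is found by a generic
        solver rather than the paper's dedicated algorithm.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    eps = np.sqrt(np.log(2.0 / delta) / (2.0 * n))

    t = np.concatenate(([a], x, [b]))                 # knot locations
    w = np.maximum(np.diff(t), 1e-12)                 # interval widths (floored; assumes distinct samples)
    Fn = np.concatenate(([0.0], np.arange(1, n + 1) / n, [1.0]))  # empirical CDF at knots

    lo = np.clip(Fn - eps, 0.0, 1.0)                  # lower edge of the band
    hi = np.clip(Fn + eps, 0.0, 1.0)                  # upper edge of the band
    lo[0] = hi[0] = 0.0                               # pin F(a) = 0
    lo[-1] = hi[-1] = 1.0                             # pin F(b) = 1

    def neg_entropy(c):
        # Entropy of the piecewise-uniform density with CDF values c at
        # the interior knots: sum_i p_i * log(w_i / p_i).
        F = np.concatenate(([0.0], c, [1.0]))
        p = np.maximum(np.diff(F), 1e-12)             # interval masses, floored for log
        return -np.sum(p * np.log(w / p))

    c0 = np.clip(Fn[1:-1], lo[1:-1], hi[1:-1])        # start at the empirical CDF
    res = minimize(
        neg_entropy, c0, method="SLSQP",
        bounds=list(zip(lo[1:-1], hi[1:-1])),
        constraints={"type": "ineq",                  # keep the CDF nondecreasing
                     "fun": lambda c: np.diff(np.concatenate(([0.0], c, [1.0])))},
    )
    return -res.fun
```

Since the true CDF lies in the DKW band with probability at least 1 - delta, the returned value upper-bounds the differential entropy of any consistent distribution with the same probability. For a quick sanity check, a sample from the uniform distribution on [0, 1] (true entropy 0 nats) should typically yield a small positive bound, e.g. `entropy_upper_bound(np.random.default_rng(0).uniform(0, 1, 100), 0.0, 1.0)`.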
Recommendations
- An upper bound of the measure-theoretical entropy
- A refined upper bound for entropy
- Some upper bounds for relative entropy and applications
- A note on the upper bound for the difference between two entropies
- A tight upper bound on discrete entropy
- A new entropy upper bound
- Some bounds on entropy measures in information theory
- Bounds on general entropy measures
- Tight Bound on Relative Entropy by Entropy Difference
Mathematics Subject Classification:
- Statistical aspects of information-theoretic topics (62B10)
- Measures of information, entropy (94A17)
Cited in 1 document