A Probabilistic Upper Bound on Differential Entropy

Publication:3604918

DOI: 10.1109/TIT.2008.929937
zbMATH Open: 1318.62014
arXiv: cs/0504091
MaRDI QID: Q3604918
FDO: Q3604918


Authors: Erik G. Learned-Miller, Joseph J. DeStefano


Publication date: 24 February 2009

Published in: IEEE Transactions on Information Theory

Abstract: A novel, non-trivial, probabilistic upper bound on the entropy of an unknown one-dimensional distribution, given the support of the distribution and a sample from that distribution, is presented. No knowledge beyond the support of the unknown distribution is required, nor is the distribution required to have a density. Previous distribution-free bounds on the cumulative distribution function of a random variable given a sample of that variable are used to construct the bound. A simple, fast, and intuitive algorithm for computing the entropy bound from a sample is provided.
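The bound described in the abstract rests on two standard ingredients, which the following Python sketch illustrates. This is an illustration of the ingredients, not the authors' algorithm: the interval widths and masses below are hypothetical, and the Dvoretzky–Kiefer–Wolfowitz inequality is one example of a distribution-free CDF bound of the kind the abstract refers to. The second ingredient is the closed-form differential entropy of a piecewise-uniform density, which can never exceed the trivial bound log(b − a) given by the support alone.

```python
import math

def dkw_halfwidth(n, delta):
    """Dvoretzky-Kiefer-Wolfowitz band half-width: with probability
    at least 1 - delta, sup_x |F_n(x) - F(x)| <= eps for a sample
    of size n (a distribution-free bound on the CDF)."""
    return math.sqrt(math.log(2.0 / delta) / (2.0 * n))

def piecewise_uniform_entropy(widths, masses):
    """Differential entropy (in nats) of a density that is constant
    on each interval: h = sum_i p_i * log(w_i / p_i)."""
    return sum(p * math.log(w / p) for w, p in zip(widths, masses) if p > 0)

# Hypothetical example: support [0, 1] cut into three intervals.
eps = dkw_halfwidth(n=100, delta=0.05)   # ~0.136
widths = [0.2, 0.3, 0.5]                 # interval lengths, sum to b - a = 1
masses = [0.5, 0.3, 0.2]                 # probability per interval, sum to 1
h = piecewise_uniform_entropy(widths, masses)
# h is strictly below the trivial support-only bound log(b - a) = log(1) = 0
```

By Jensen's inequality, h = Σ pᵢ log(wᵢ/pᵢ) ≤ log(Σ wᵢ) = log(b − a), with equality only when the masses are proportional to the widths; the paper's contribution is a non-trivial, sample-dependent tightening of this kind of bound.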


Full work available at URL: https://arxiv.org/abs/cs/0504091








Cited In (1)





