Infinite Shannon entropy

DOI: 10.1088/1742-5468/2013/04/P04010
zbMATH Open: 1456.82006
arXiv: 1212.5630
OpenAlex: W3104876448
Wikidata: Q59619451
Scholia: Q59619451
MaRDI QID: Q3301575
FDO: Q3301575


Authors: Valentina Baccetti, Matt Visser


Publication date: 11 August 2020

Published in: Journal of Statistical Mechanics: Theory and Experiment

Abstract: Even if a probability distribution is properly normalizable, its associated Shannon (or von Neumann) entropy can easily be infinite. We carefully analyze conditions under which this phenomenon can occur. Roughly speaking, this happens when arbitrarily small amounts of probability are dispersed into an infinite number of states; we shall quantify this observation and make it precise. We develop several particularly simple, elementary, and useful bounds, and also provide some asymptotic estimates, leading to necessary and sufficient conditions for the occurrence of infinite Shannon entropy. We go to some effort to keep technical computations as simple and conceptually clear as possible. In particular, we shall see that large entropies cannot be localized in state space; large entropies can only be supported on an exponentially large number of states. We are for the time being interested in single-channel Shannon entropy in the information theoretic sense, not entropy in a stochastic field theory or QFT defined over some configuration space, on the grounds that this simple problem is a necessary precursor to understanding infinite entropy in a field theoretic context.
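
The mechanism described in the abstract can be made concrete with the textbook example of weights w_n proportional to 1/(n (ln n)^2) for n >= 2: the normalization sum converges, yet the entropy of the truncated distribution grows without bound, roughly like ln ln N. The following minimal Python sketch is an illustration of that standard example under these stated assumptions; it is not code from the paper.

```python
import math

# Illustrative sketch (not from the paper): weights w_n = 1 / (n * (ln n)^2),
# n >= 2, give a properly normalizable distribution whose Shannon entropy
# is infinite -- probability is spread in ever smaller amounts over
# infinitely many states, exactly the mechanism the abstract describes.

def truncated_entropy(N: int) -> tuple[float, float]:
    """Return (Z_N, H_N) for the distribution truncated to states 2..N."""
    z = 0.0  # partial normalization sum, sum of w_n
    s = 0.0  # partial sum of -w_n * ln(w_n)
    for n in range(2, N + 1):
        w = 1.0 / (n * math.log(n) ** 2)
        z += w
        s -= w * math.log(w)
    # For p_n = w_n / Z_N, the entropy is H_N = s / Z_N + ln Z_N.
    return z, s / z + math.log(z)

if __name__ == "__main__":
    for N in (10**2, 10**4, 10**6):
        z, h = truncated_entropy(N)
        print(f"N = {N:>9}   Z_N = {z:.4f}   H_N = {h:.4f}")
```

Running this shows Z_N settling toward a finite limit while H_N keeps creeping upward; the divergence is only doubly logarithmic in N, consistent with the abstract's observation that large entropies can only be supported on an exponentially large number of states.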


Full work available at URL: https://arxiv.org/abs/1212.5630



