Entropy estimation of symbol sequences


DOI: 10.1063/1.166191
zbMATH Open: 1055.94508
arXiv: cond-mat/0203436
OpenAlex: W2018891628
Wikidata: Q34202017
Scholia: Q34202017
MaRDI QID: Q4526375
FDO: Q4526375


Authors: Thomas Schürmann, P. Grassberger


Publication date: 16 January 2001

Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science

Abstract: We discuss algorithms for estimating the Shannon entropy h of finite symbol sequences with long-range correlations. In particular, we consider algorithms which estimate h from the code lengths produced by some compression algorithm. Our interest is in describing their convergence with sequence length, assuming no limits on the space and time complexities of the compression algorithms. A scaling law is proposed for extrapolation from finite sample lengths. This is applied to sequences of dynamical systems in non-trivial chaotic regimes, a 1-D cellular automaton, and to written English texts.
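The core idea in the abstract, estimating the entropy rate h from the code length a compressor produces, can be sketched in a few lines. The snippet below is an illustration only, using a general-purpose compressor (zlib) rather than the specific estimators analyzed in the paper; the compressed length per symbol gives an upper-bound estimate of h in bits per symbol.

```python
import os
import zlib

def entropy_estimate(seq: bytes, level: int = 9) -> float:
    """Crude upper-bound estimate of the entropy rate h (bits per symbol)
    from the code length produced by a general-purpose compressor.
    Illustrative sketch; not the estimators studied in the paper."""
    compressed = zlib.compress(seq, level)
    # 8 bits per compressed byte, divided by the number of input symbols.
    return 8 * len(compressed) / len(seq)

# A highly regular sequence compresses well, so the estimate is near 0;
# incompressible random bytes stay near (or slightly above) 8 bits/symbol.
regular = b"ab" * 50_000
random_like = os.urandom(100_000)
print(entropy_estimate(regular))
print(entropy_estimate(random_like))
```

The overestimate for finite sequences shrinks as the sequence grows, which is the convergence behavior (and the proposed scaling law for extrapolation) that the paper studies.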


Full work available at URL: https://arxiv.org/abs/cond-mat/0203436




Cited In (42)





