Entropy estimation of symbol sequences
Publication:4526375
DOI: 10.1063/1.166191 · zbMath: 1055.94508 · arXiv: cond-mat/0203436 · OpenAlex: W2018891628 · Wikidata: Q34202017 · Scholia: Q34202017 · MaRDI QID: Q4526375
Thomas Schürmann, Peter Grassberger
Publication date: 16 January 2001
Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science
Full work available at URL: https://arxiv.org/abs/cond-mat/0203436
MSC classification: Protein sequences, DNA sequences (92D20); Applications of dynamical systems (37N99); Dynamical aspects of cellular automata (37B15); Measures of information, entropy (94A17)
Related Items (24)
- Contrasting stochasticity with chaos in a permutation Lempel-Ziv complexity -- Shannon entropy plane
- GeoEntropy: a measure of complexity and similarity
- Phenomenology of coupled nonlinear oscillators
- BIAS REDUCTION OF THE NEAREST NEIGHBOR ENTROPY ESTIMATOR
- Coincidences and estimation of entropies of random variables with large cardinalities
- Forecasting the unemployment rate over districts with the use of distinct methods
- Computational capabilities at the edge of chaos for one dimensional systems undergoing continuous transitions
- A Note on Entropy Estimation
- Symbolic dynamics of jejunal motility in the irritable bowel
- Regularities unseen, randomness observed: Levels of entropy convergence
- TOWARDS THE QUANTIFICATION OF THE SEMANTIC INFORMATION ENCODED IN WRITTEN LANGUAGE
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Convergence properties of functional estimates for discrete distributions
- Complexity through nonextensivity
- On the non-randomness of maximum Lempel Ziv complexity sequences of finite size
- SYNCHRONIZING TO THE ENVIRONMENT: INFORMATION-THEORETIC CONSTRAINTS ON AGENT LEARNING
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- Artificial sequences and complexity measures
- Symbolic partition in chaotic maps
- Optimal instruments and models for noisy chaos
- Estimating entropy rate from censored symbolic time series: A test for time-irreversibility
- Non-sequential recursive pair substitution: some rigorous results
- Modified correlation entropy estimation for a noisy chaotic time series
- Exit-times and \(\varepsilon\)-entropy for dynamical systems, stochastic processes, and turbulence
Cites Work
- Stochastic complexity and modeling
- Random sequence generation by cellular automata
- Entropy and prefixes
- Universal schemes for prediction, gambling and portfolio selection
- Quantitative universality for a class of nonlinear transformations
- On the symbolic dynamics of the Hénon map
- The redundancy of texts in three languages
- Prediction and Entropy of Printed English
- A universal data compression system
- Universal modeling and coding
- The performance of universal encoding
- A note on the Ziv-Lempel model for compressing individual sequences (Corresp.)
- On the Complexity of Finite Sequences
- A convergent gambling estimate of the entropy of English
- Compression of individual sequences via variable-rate coding
- LONG RANGE CORRELATION IN HUMAN WRITINGS
- Enlarged scaling ranges for the KS‐entropy and the information dimension
- A universal finite memory source
- The context-tree weighting method: basic properties
- On the Length of Programs for Computing Finite Binary Sequences
- Logical basis for information theory and probability theory
- On Information and Sufficiency