Entropy estimation of symbol sequences
Publication:4526375
Abstract: We discuss algorithms for estimating the Shannon entropy h of finite symbol sequences with long range correlations. In particular, we consider algorithms which estimate h from the code lengths produced by some compression algorithm. Our interest is in describing their convergence with sequence length, assuming no limits for the space and time complexities of the compression algorithms. A scaling law is proposed for extrapolation from finite sample lengths. This is applied to sequences of dynamical systems in non-trivial chaotic regimes, a 1-D cellular automaton, and to written English texts.
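The abstract's core idea, estimating the entropy rate h from the code lengths produced by a compression algorithm, can be sketched in a few lines. The snippet below is an illustrative stand-in using Python's zlib (an LZ77-family compressor), not the authors' exact algorithms; the function name and the choice of test sequences are assumptions for demonstration.

```python
import random
import zlib

def compression_entropy_estimate(symbols: str, level: int = 9) -> float:
    """Estimate the Shannon entropy rate h (bits per symbol) of a symbol
    sequence from the compressed code length.

    Illustrative sketch only: zlib stands in for the universal compression
    algorithms discussed in the paper."""
    data = symbols.encode("ascii")
    compressed = zlib.compress(data, level)
    # Code length in bits divided by sequence length bounds h from above;
    # the bias decays slowly with sequence length, which is what motivates
    # the finite-size extrapolation (scaling law) proposed in the paper.
    return 8 * len(compressed) / len(data)

# A periodic sequence compresses far better than a random one over the
# same alphabet, so its entropy estimate is much smaller.
periodic = "ab" * 5000
random.seed(0)
noisy = "".join(random.choice("ab") for _ in range(10000))
print(compression_entropy_estimate(periodic) < compression_entropy_estimate(noisy))
```

Because the compressor's overhead shrinks only slowly with sequence length, raw estimates of this kind converge slowly to h, and extrapolation from several sample lengths (as the paper proposes) is needed in practice.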
Cites work
- A convergent gambling estimate of the entropy of English
- A note on the Ziv-Lempel model for compressing individual sequences (Corresp.)
- A universal data compression system
- A universal finite memory source
- Compression of individual sequences via variable-rate coding
- Enlarged scaling ranges for the KS-entropy and the information dimension
- Entropy and prefixes
- Long range correlation in human writings
- Logical basis for information theory and probability theory
- On Information and Sufficiency
- On the Complexity of Finite Sequences
- On the Length of Programs for Computing Finite Binary Sequences
- On the symbolic dynamics of the Henon map
- Prediction and Entropy of Printed English
- Quantitative universality for a class of nonlinear transformations
- Random sequence generation by cellular automata
- Stochastic complexity and modeling
- The context-tree weighting method: basic properties
- The performance of universal encoding
- The redundancy of texts in three languages
- Universal modeling and coding
- Universal schemes for prediction, gambling and portfolio selection
Cited in (43)
- Optimal instruments and models for noisy chaos
- Artificial sequences and complexity measures
- A note on entropy estimation
- Symbolic dynamics of jejunal motility in the irritable bowel
- Large alphabets and incompressibility
- Estimating the entropy of binary time series: methodology, some theory and a simulation study
- Synchronizing to the environment: information-theoretic constraints on agent learning
- Exit-times and ε-entropy for dynamical systems, stochastic processes, and turbulence
- Modified correlation entropy estimation for a noisy chaotic time series
- Coincidences and estimation of entropies of random variables with large cardinalities
- Variance of entropy for testing time-varying regimes with an application to meme stocks
- On the non-randomness of maximum Lempel Ziv complexity sequences of finite size
- Regularities unseen, randomness observed: Levels of entropy convergence
- Non-sequential recursive pair substitution: some rigorous results
- Towards the quantification of the semantic information encoded in written language
- Contrasting stochasticity with chaos in a permutation Lempel-Ziv complexity / Shannon entropy plane
- Complexity through nonextensivity
- scientific article (zbMATH DE number 1952702; no title available)
- Word frequency and entropy of symbolic sequences: A dynamical perspective
- Finite sample effects in sequence analysis
- Entropy of high-order Markov chains beyond the pair correlations
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- scientific article (zbMATH DE number 4090422; no title available)
- BIAS REDUCTION OF THE NEAREST NEIGHBOR ENTROPY ESTIMATOR
- Convergence properties of functional estimates for discrete distributions
- Entropy analysis of word-length series of natural language texts: effects of text language and genre
- Non-sequential recursive pair substitutions and numerical entropy estimates in symbolic dynamical systems
- Possible sets of autocorrelations and the Simplex algorithm
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Estimating entropy rate from censored symbolic time series: A test for time-irreversibility
- Guessing probability distributions from small samples
- Entropy of underdetermined sequences under constraints to specifications
- GeoEntropy: a measure of complexity and similarity
- Symbolic partition in chaotic maps
- Entropy of natural languages: Theory and experiment
- Transfer entropy on symbolic recurrences
- Price predictability at ultra-high frequency: entropy-based randomness test
- Phenomenology of coupled nonlinear oscillators
- Forecasting the unemployment rate over districts with the use of distinct methods
- Compactness of symbolic sequences from chaotic systems
- scientific article (zbMATH DE number 4062892; no title available)
- Maximally predictive states: from partial observations to long timescales
- Computational capabilities at the edge of chaos for one dimensional systems undergoing continuous transitions