Entropic measures, Markov information sources and complexity
DOI: 10.1016/S0096-3003(01)00199-0
zbMATH Open: 1029.94008
OpenAlex: W2047066710
Wikidata: Q57001704 (Scholia: Q57001704)
MaRDI QID: Q1855845
FDO: Q1855845
Authors: Cristian S. Calude, Monica Dumitrescu
Publication date: 28 January 2003
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/s0096-3003(01)00199-0
Keywords: complexity; algorithmic probability; Shannon's entropy; Markov sources; entropy rate; entropic measures; Markov information sources
MSC classification:
- 62B10 Statistical aspects of information-theoretic topics
- 60J10 Markov chains (discrete-time Markov processes on discrete state spaces)
- 94A17 Measures of information, entropy
- 60J27 Continuous-time Markov processes on discrete state spaces
- 68Q30 Algorithmic information theory (Kolmogorov complexity, etc.)
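The keywords above link Shannon's entropy rate to Markov information sources. As a minimal illustrative sketch (the transition matrix below is hypothetical, not taken from the paper), the entropy rate of a stationary Markov source is the stationary-weighted average of the per-state transition entropies:

```python
import math

# Hypothetical 2-state Markov source; transition probabilities are
# illustrative only.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def stationary(P, iters=1000):
    """Approximate the stationary distribution pi = pi P by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """Shannon entropy rate H = -sum_i pi_i sum_j P[i][j] * log2 P[i][j]."""
    n = len(P)
    pi = stationary(P)
    return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                for i in range(n) for j in range(n)
                if P[i][j] > 0)

print(round(entropy_rate(P), 4))  # ≈ 0.5575 bits per symbol
```

For this chain the stationary distribution is (5/6, 1/6), so the rate mixes a low-entropy state (H ≈ 0.469 bits) with a maximally uncertain one (H = 1 bit).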
Cites Work
- Title not available
- A Mathematical Theory of Communication
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- Finite Continuous Time Markov Chains
- Information, randomness and incompleteness. Papers on algorithmic information theory
- Title not available
- Kolmogorov's contributions to information theory and algorithmic complexity
- Title not available
- Title not available
- Algorithmic information and simplicity in statistical physics
- A proof of the Beyer-Stein-Ulam relation between complexity and entropy
- Some informational properties of Markov pure-jump processes
- Title not available
- Randomness and complexity in pure mathematics
- Title not available
- Title not available
- Coding with minimal programs
Cited In (15)
- Title not available
- Title not available
- Entropy and higher moments of information
- Reexamination of an information geometric construction of entropic indicators of complexity
- Entropy, search, complexity.
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- A note on Kolmogorov complexity and entropy
- Title not available
- Complex entropy and resultant information measures
- The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems
- Complexity of strings in the class of Markov sources
- Chaitin complexity, Shannon information content of a single event, and infinite random sequences. II
- On degrees of randomness and genetic randomness
- Complexity measures in terms of general dynamics: the information capacitance
- Building sources of zero entropy: rescaling and inserting delays (invited talk)