In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory
DOI: 10.1093/BJPS/55.3.411
zbMATH Open: 1077.94507
OpenAlex: W2124842410
MaRDI QID: Q4669332
Authors: Roman Frigg
Publication date: 15 April 2005
Published in: The British Journal for the Philosophy of Science
Full work available at URL: https://doi.org/10.1093/bjps/55.3.411
Recommendations
- Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics
- Chaos communication: an overview of exact, optimum and approximate results using statistical theory
- Dynamic justification of the entropy approach for one class of communication systems
- Measuring information beyond communication theory - why some generalized information measures may be useful, others not
- Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations
- Title not available (zbMATH DE number 1377734)
- Chaos and randomness: An equivalence proof of a generalized version of the Shannon entropy and the Kolmogorov-Sinai entropy for Hamiltonian dynamical systems
- Chaos communication performance: theory and computation
Mathematics Subject Classification
- Strange attractors, chaotic dynamics of systems with hyperbolic behavior (37D45)
- Measures of information, entropy (94A17)
- Entropy and other invariants, isomorphism, classification in ergodic theory (37A35)
- Complex behavior and chaotic systems of ordinary differential equations (34C28)
- Foundations of time-dependent statistical mechanics (82C03)
Cited In (15)
- On entropy rates of dynamical systems and Gaussian processes
- Universal Relation between the Kolmogorov-Sinai Entropy and the Thermodynamical Entropy in Simple Liquids
- Nonlinear dynamics of two-dimensional cardiac action potential duration mapping model with memory
- Title not available
- Convergence of the \(K_2\) entropy for random noises with power law spectra
- Entropy and algorithmic complexity in quantum information theory
- Kolmogorov–Sinai entropy and black holes
- Classical, quantum and biological randomness as relative unpredictability
- The ergodic hierarchy, randomness and Hamiltonian chaos
- A brief introduction to observational entropy
- Maximum Kolmogorov-Sinai entropy versus minimum mixing time in Markov chains
- Stable regularities without governing laws?
- Chaos and randomness: An equivalence proof of a generalized version of the Shannon entropy and the Kolmogorov-Sinai entropy for Hamiltonian dynamical systems
- Kolmogorov-Sinai entropy from recurrence times
- On relations among the entropic chaos degree, the Kolmogorov-Sinai entropy and the Lyapunov exponent