On the connections of generalized entropies with Shannon and Kolmogorov-Sinai entropies
Publication: 296289
DOI: 10.3390/e16073732
zbMATH: 1338.94036
arXiv: 1302.6403
OpenAlex: W2271169255
Wikidata: Q51857455 (Scholia: Q51857455)
MaRDI QID: Q296289
Publication date: 15 June 2016
Published in: Entropy, Chaos: An Interdisciplinary Journal of Nonlinear Science
Full work available at URL: https://arxiv.org/abs/1302.6403
Mathematics Subject Classification:
- Dynamical aspects of measure-preserving transformations (37A05)
- Entropy and other invariants, isomorphism, classification in ergodic theory (37A35)
- Measures of information, entropy (94A17)
Related Items (5)
- On local Tsallis entropy of relative dynamical systems
- Tsallis entropy of partitions in quantum logics
- Generalized Conditional Entropy — Determinicity of a Process and Rokhlin's Formula
- A unified time scale for quantum chaotic regimes
- On entropy, entropy-like quantities, and applications
Cites Work
- A Mathematical Theory of Communication
- On entropy-like invariants for dynamical systems
- On a class of generalized K-entropies and Bernoulli shifts
- Axiomatic characterizations of information measures
- Entropy is the only finitely observable invariant
- Toward a quantitative theory of self-generated complexity
- Possible entropy functions
- Global and local complexity in weakly chaotic dynamical systems
- Rényi entropies of aperiodic dynamical systems
- Power-law sensitivity to initial conditions — new entropic representation
- Possible generalization of Boltzmann-Gibbs statistics.
- Tsallis entropy: how unique?
- Entropy dimensions and a class of constructive examples
- Possible rates of entropy convergence
- Information theoretical properties of Tsallis entropies
- Computation in Sofic Quantum Dynamical Systems
- Generalized entropies: Rényi and correlation integral approach
- SYNCHRONIZING TO THE ENVIRONMENT: INFORMATION-THEORETIC CONSTRAINTS ON AGENT LEARNING
- On the Kolmogorov-like generalization of Tsallis entropy, correlation entropies and multifractal analysis
- Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression
- Invariant of dynamical systems: A generalized entropy
- Information-theoretical considerations on estimation problems
- Regularities unseen, randomness observed: Levels of entropy convergence
- Subadditive functions
- Entropic nonextensivity: A possible measure of complexity