Generalized Conditional Entropy — Determinicity of a Process and Rokhlin's Formula
DOI: 10.1142/S1230161215500250
zbMath: 1387.81106
MaRDI QID: Q5744712
Publication date: 19 February 2016
Published in: Open Systems & Information Dynamics
Related Items (2)
- On local Tsallis entropy of relative dynamical systems
- Notes on use of generalized entropies in counting
Cites Work
- A Mathematical Theory of Communication
- On the connections of generalized entropies with Shannon and Kolmogorov-Sinai entropies
- On entropy-like invariants for dynamical systems
- On a class of generalized K-entropies and Bernoulli shifts
- Axiomatic characterizations of information measures
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Power-law sensitivity to initial conditions---new entropic representation
- Possible generalization of Boltzmann-Gibbs statistics
- Tsallis entropy: how unique?
- Fundamental properties of Tsallis relative entropy
- Information theoretical properties of Tsallis entropies
- On Uniqueness Theorems for Tsallis Entropy and Tsallis Relative Entropy
- Rényi Extrapolation of Shannon Entropy
- Entropy computing via integration over fractal measures
- On the Kolmogorov-like generalization of Tsallis entropy, correlation entropies and multifractal analysis
- Further results on generalized conditional entropies
- Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression
- Invariant of dynamical systems: A generalized entropy
- On Rohlin's formula for entropy
- Information-theoretical considerations on estimation problems
- Regularities unseen, randomness observed: Levels of entropy convergence
- On Information and Sufficiency
- Entropic nonextensivity: A possible measure of complexity