Entropy measures vs. Kolmogorov complexity
Publication:657548
DOI: 10.3390/e13030595 · zbMATH Open: 1229.94037 · OpenAlex: W2080341202 · Wikidata: Q62038777 · Scholia: Q62038777 · MaRDI QID: Q657548 · FDO: Q657548
Authors: André Souto, Luís Antunes, Andreia Sofia Teixeira, Armando B. Matos
Publication date: 9 January 2012
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e13030595
Cites Work
- A Mathematical Theory of Communication
- Possible generalization of Boltzmann-Gibbs statistics
- A formal theory of inductive inference. Part I
- Title not available
- On the Length of Programs for Computing Finite Binary Sequences
- Three approaches to the quantitative definition of information
- An introduction to Kolmogorov complexity and its applications
- Uniform estimates on the Tsallis entropies
- Thermodynamic stability conditions for the Tsallis and Rényi entropies
- Kolmogorov's contributions to information theory and algorithmic complexity
Cited In (6)
- Entropy of the K-Satisfiability Problem
- The Complexity of Approximating the Entropy
- New statistical models of nonergodic cognitive systems and their pathologies
- Relating description complexity to entropy
- Shannon Entropy vs. Kolmogorov Complexity
- An entropy based measure for comparing distributions of complexity
This page was built for publication: Entropy measures vs. Kolmogorov complexity