Shannon Entropy vs. Kolmogorov Complexity
From MaRDI portal
Publication:3434702
DOI: 10.1007/11753728_29
zbMATH Open: 1185.68370
OpenAlex: W1581042959
MaRDI QID: Q3434702
FDO: Q3434702
Authors: Andrej Muchnik, Nikolai K. Vereshchagin
Publication date: 2 May 2007
Published in: Computer Science – Theory and Applications
Full work available at URL: https://doi.org/10.1007/11753728_29
Recommendations
- Entropy measures vs. Kolmogorov complexity
- Inequalities for Shannon entropy and Kolmogorov complexity
- A comparison of the Shannon and Kullback information measures
- scientific article; zbMATH DE number 1107583
- Inductive complexity and Shannon entropy
- A note on Kolmogorov complexity and entropy
- Shannon entropy versus Renyi entropy from a cryptographic viewpoint
- scientific article; zbMATH DE number 61024
- Shannon entropy reinterpreted
Mathematics Subject Classification:
- Measures of information, entropy (94A17)
- Algorithmic information theory (Kolmogorov complexity, etc.) (68Q30)
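The paper's theme, comparing Shannon entropy with Kolmogorov complexity, can be illustrated with a small sketch (not from the paper itself): for a string drawn from a simple source, the empirical Shannon entropy per symbol tracks how well the string compresses, and compressed length is a crude computable stand-in for Kolmogorov complexity. The function name `shannon_entropy` and the use of `zlib` as the compressor are illustrative choices.

```python
import math
import zlib

def shannon_entropy(s):
    """Empirical Shannon entropy (bits per symbol) of a string."""
    counts = {}
    for ch in s:
        counts[ch] = counts.get(ch, 0) + 1
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A low-entropy string vs. a higher-entropy one of similar length.
low = "ab" * 500
high = "the quick brown fox jumps over the lazy dog " * 22

for name, s in [("low", low), ("high", high)]:
    h = shannon_entropy(s)
    compressed = len(zlib.compress(s.encode()))
    print(f"{name}: entropy/symbol = {h:.3f} bits, zlib bytes = {compressed}")
```

The repetitive string has entropy of exactly 1 bit per symbol and compresses to a handful of bytes; the English text has several bits per symbol and a much larger compressed size, mirroring (loosely) the entropy-complexity correspondence the paper studies rigorously.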
Cited In (7)
- Retracing some paths in categorical semantics: from process-propositions-as-types to categorified reals and computers
- High entropy random selection protocols
- Combinatorial interpretation of Kolmogorov complexity
- Relating description complexity to entropy
- Entropy measures vs. Kolmogorov complexity
- Inequalities for space-bounded Kolmogorov complexity
- On joint conditional complexity (entropy)
This page was built for publication: Shannon Entropy vs. Kolmogorov Complexity