Entropy measures vs. Kolmogorov complexity (Q657548)

From MaRDI portal
Property / full work available at URL: https://doi.org/10.3390/e13030595
Property / OpenAlex ID: W2080341202
Property / Wikidata QID: Q62038777
Property / cites work: An introduction to Kolmogorov complexity and its applications
Property / cites work: A Mathematical Theory of Communication
Property / cites work: Possible generalization of Boltzmann-Gibbs statistics.
Property / cites work: Kolmogorov's contributions to information theory and algorithmic complexity
Property / cites work: A formal theory of inductive inference. Part I
Property / cites work: Three approaches to the quantitative definition of information
Property / cites work: On the Length of Programs for Computing Finite Binary Sequences
Property / cites work: Q4074808
Property / cites work: Thermodynamic stability conditions for the Tsallis and Rényi entropies
Property / cites work: Uniform estimates on the Tsallis entropies

Latest revision as of 19:15, 4 July 2024

Language: English
Label: Entropy measures vs. Kolmogorov complexity
Description: scientific article

    Statements

    Entropy measures vs. Kolmogorov complexity (English)
    9 January 2012
    Summary: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order \(\alpha \), showing that it only holds for \(\alpha = 1\). For a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution \(m^{t}(x)\), the Tsallis and Rényi entropies converge if and only if \(\alpha \) is greater than 1. We also establish the uniform continuity of these entropies.
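    As a minimal illustration of the entropy measures the abstract compares (not code from the paper itself), the sketch below computes Shannon, Rényi, and Tsallis entropies for a finite distribution; the example distribution and function names are our own. Near \(\alpha = 1\), Rényi entropy approaches Shannon entropy (in bits), while Tsallis entropy approaches Shannon entropy measured in nats.

    ```python
    import math

    def shannon_entropy(p):
        """Shannon entropy H(P) = -sum p_i * log2(p_i), in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def renyi_entropy(p, alpha):
        """Renyi entropy H_a(P) = log2(sum p_i^a) / (1 - a), for a != 1."""
        return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

    def tsallis_entropy(p, alpha):
        """Tsallis entropy S_a(P) = (1 - sum p_i^a) / (a - 1), for a != 1.
        As a -> 1 this tends to -sum p_i * ln(p_i), i.e. Shannon entropy in nats."""
        return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

    p = [0.5, 0.25, 0.125, 0.125]          # a dyadic example distribution
    print(shannon_entropy(p))              # → 1.75 bits
    for alpha in (0.999, 1.001):           # both entropies approach the a = 1 limit
        print(alpha, renyi_entropy(p, alpha), tsallis_entropy(p, alpha))
    ```

    For \(\alpha \neq 1\) the three measures generally disagree, which is consistent with the abstract's claim that the expected-complexity relationship holds only at \(\alpha = 1\).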
    Kolmogorov complexity
    Shannon entropy
    Rényi entropy
    Tsallis entropy

    Identifiers