Entropy measures vs. Kolmogorov complexity (Q657548)

Language: English
Label: Entropy measures vs. Kolmogorov complexity
Description: scientific article

    Statements

    title: Entropy measures vs. Kolmogorov complexity (English)
    publication date: 9 January 2012
    Summary: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals the distribution's Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order \(\alpha\), showing that it holds only for \(\alpha = 1\). Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the time-bounded universal distribution \(m^{t}(x)\), the Tsallis and Rényi entropies converge if and only if \(\alpha > 1\). We also establish the uniform continuity of these entropies.
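
    For context, here is a brief sketch of the quantities named in the summary, using common conventions (base-2 logarithms, prefix-free Kolmogorov complexity \(K\)); the paper's exact normalizations may differ. The Rényi and Tsallis entropies of order \(\alpha\) of a probability distribution \(P\) are
    \[
      H_{\alpha}(P) = \frac{1}{1-\alpha}\log\sum_{x} P(x)^{\alpha},
      \qquad
      T_{\alpha}(P) = \frac{1}{\alpha-1}\Bigl(1-\sum_{x} P(x)^{\alpha}\Bigr),
      \qquad \alpha > 0,\ \alpha \neq 1,
    \]
    and both tend to the Shannon entropy \(H(P) = -\sum_{x} P(x)\log P(x)\) as \(\alpha \to 1\). The classical relationship behind the summary's first claim is that, for any recursive probability distribution \(P\),
    \[
      0 \;\le\; \sum_{x} P(x)\,K(x) - H(P) \;\le\; K(P) + O(1),
    \]
    where \(K(P)\) denotes the complexity of describing \(P\) itself and the additive constant depends only on the reference machine.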
    Keywords:
    Kolmogorov complexity
    Shannon entropy
    Rényi entropy
    Tsallis entropy

    Identifiers