Entropy measures vs. Kolmogorov complexity (Q657548)

From MaRDI portal





scientific article; zbMATH DE number 5995938

      Statements

      Entropy measures vs. Kolmogorov complexity (English)
      9 January 2012
      Summary: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order \(\alpha \), showing that it holds only for \(\alpha = 1\). Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution \(m^{t}(x)\), the Tsallis and Rényi entropies converge if and only if \(\alpha \) is greater than 1. We also establish the uniform continuity of these entropies.
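      The abstract's claim that the three entropy measures agree only at \(\alpha = 1\) can be illustrated numerically. The sketch below (function names and the example distribution are illustrative, not taken from the paper) uses the standard definitions in natural units: Shannon entropy \(H(p) = -\sum_i p_i \ln p_i\), Rényi entropy \(H_\alpha(p) = \frac{1}{1-\alpha}\ln\sum_i p_i^\alpha\), and Tsallis entropy \(S_\alpha(p) = \frac{1 - \sum_i p_i^\alpha}{\alpha - 1}\); both parametric entropies tend to \(H(p)\) as \(\alpha \to 1\) but differ from it for other \(\alpha\) on non-uniform distributions.

      ```python
      import math

      def shannon(p):
          """Shannon entropy H(p) = -sum p_i ln p_i, in nats."""
          return -sum(pi * math.log(pi) for pi in p if pi > 0)

      def renyi(p, alpha):
          """Renyi entropy of order alpha (alpha != 1), in nats."""
          return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

      def tsallis(p, alpha):
          """Tsallis entropy of order alpha (alpha != 1)."""
          return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

      # Illustrative non-uniform distribution (not from the paper).
      p = [0.5, 0.25, 0.25]

      # Near alpha = 1, both parametric entropies approach Shannon entropy;
      # away from alpha = 1 they differ on a non-uniform distribution.
      print(shannon(p), renyi(p, 1.0001), tsallis(p, 1.0001), renyi(p, 2.0))
      ```

      Running it with \(\alpha\) stepping toward 1 from either side shows the gap to the Shannon value shrinking to zero, consistent with the equality holding only at \(\alpha = 1\).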
      Kolmogorov complexity
      Shannon entropy
      Rényi entropy
      Tsallis entropy

      Identifiers