Information meaning of entropy of nonergodic measures (Q2313324)

From MaRDI portal
Language: English
Label: Information meaning of entropy of nonergodic measures
Description: scientific article

    Statements

    Information meaning of entropy of nonergodic measures (English)
    19 July 2019
    The main aim of this paper is to study the limiting frequency properties of trajectories of the simplest dynamical system, namely the left shift on the space of sequences of letters from a finite alphabet. More precisely, a modification of the Shannon-McMillan-Breiman theorem is proved: for any shift-invariant (not necessarily ergodic) probability measure \(\mu\) on the sequence space, the logarithm of the cardinality of the set of all \(\mu\)-typical sequences of length \(n\) grows asymptotically as \(nh(\mu)\) as \(n \to \infty\), where \(h(\mu)\) is the entropy of \(\mu\). Here a finite sequence of letters is called typical if the empirical measure it generates is close to \(\mu\) in the weak topology. In Section 1, the definitions of entropy and empirical measures are given and the main theorem (Theorem 1) is stated. Section 2 provides several auxiliary lemmas needed for the proof, and the main theorem is then proved in Section 3. Most of the auxiliary lemmas are known, but to keep the presentation complete and self-contained, the author gives their proofs in Section 4.
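    A schematic formulation of this statement, in notation introduced here only for illustration (the paper's precise formulation, in particular the order of the limits and the choice of metric, may differ): writing \(A\) for the finite alphabet, \(\mu_w\) for the empirical measure generated by a word \(w \in A^n\), and \(\rho\) for a metric compatible with the weak topology, one expects
    \[
      \lim_{\varepsilon \to 0}\,\limsup_{n \to \infty} \frac{1}{n}\log \#\bigl\{ w \in A^{n} : \rho(\mu_{w}, \mu) < \varepsilon \bigr\}
      \;=\;
      \lim_{\varepsilon \to 0}\,\liminf_{n \to \infty} \frac{1}{n}\log \#\bigl\{ w \in A^{n} : \rho(\mu_{w}, \mu) < \varepsilon \bigr\}
      \;=\; h(\mu),
    \]
    valid for every shift-invariant probability measure \(\mu\), without any ergodicity assumption.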
    measure of information
    entropy
    Shannon-McMillan-Breiman theorem
    non-ergodic measure
    typical finite sequence of letters

    Identifiers