A limit theorem for the entropy density of nonhomogeneous Markov information source (Q1347196)

Language: English
Label: A limit theorem for the entropy density of nonhomogeneous Markov information source
Description: scientific article

    Statements

    A limit theorem for the entropy density of nonhomogeneous Markov information source (English)
    23 April 1995
    A variant of the strong law of large numbers for the relative entropy density, valid for an arbitrary nonhomogeneous Markov information source with alphabet \(S = \{1,2, \dots, m\}\), is proved. This is an important question in information theory. If \(\{X_n, n \geq 0\}\) is the sequence of successive letters produced by such a source with initial distribution \((p(1), p(2), \dots, p(m))\) and transition matrices \(P_n = (p_n (i,j))\), \(i, j \in S\), \(n \geq 1\), where \(p_n(i,j) = P(X_n = j\mid X_{n-1} = i)\), then the relative entropy density is defined as \[ f_n(\omega) = -(1/n) \biggl[\log p (X_0) + \sum^n_{k=1} \log p_k(X_{k-1}, X_k) \biggr] \] and the entropy of a probability distribution \((p_1, p_2, \dots, p_m)\) is defined as \[ H(p_1, p_2,\dots, p_m) = -\sum^m_{k=1} p_k \log p_k, \] where \(\log\) denotes the natural logarithm. The main result is that, almost surely, \[ \lim_{n \to \infty} \biggl\{f_n(\omega) - (1/n) \sum^n_{k=1} H \bigl[p_k (X_{k-1},1), \dots, p_k(X_{k-1},m)\bigr]\biggr\} = 0. \]
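    The statement can be illustrated numerically. The following is a minimal simulation sketch, not taken from the paper under review: the family of transition matrices \(P_n\) below is a hypothetical choice made only for illustration. It simulates a nonhomogeneous chain, computes the relative entropy density \(f_n(\omega)\) and the averaged conditional entropies, and checks that their difference is small for large \(n\), as the theorem asserts.

        import numpy as np

        rng = np.random.default_rng(0)
        m = 3        # alphabet size, S = {1, ..., m} (indexed 0..m-1 here)
        N = 200_000  # length of the simulated chain

        # Hypothetical nonhomogeneous transition matrices P_n: a fixed random
        # stochastic matrix plus a vanishing step-dependent perturbation.
        base = rng.dirichlet(np.ones(m), size=m)

        def P(n):
            """Transition matrix at step n; rows are renormalized to sum to 1."""
            Q = base + (1.0 / (n + 10)) * np.eye(m)
            return Q / Q.sum(axis=1, keepdims=True)

        p0 = np.full(m, 1.0 / m)  # initial distribution p(1), ..., p(m)

        x = rng.choice(m, p=p0)
        log_terms = [np.log(p0[x])]  # log p(X_0)
        cond_entropies = []          # H[p_k(X_{k-1},1), ..., p_k(X_{k-1},m)]
        for k in range(1, N + 1):
            row = P(k)[x]            # p_k(X_{k-1}, .)
            x = rng.choice(m, p=row)
            log_terms.append(np.log(row[x]))
            cond_entropies.append(-(row * np.log(row)).sum())

        f_n = -np.sum(log_terms) / N        # relative entropy density f_n(omega)
        avg_H = np.sum(cond_entropies) / N  # (1/n) sum_k H[p_k(X_{k-1}, .)]
        print(f"f_n = {f_n:.5f}   (1/n) sum H = {avg_H:.5f}   diff = {f_n - avg_H:.2e}")

    For this choice of \(P_n\) the printed difference is close to zero, in line with the almost sure convergence stated above.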
    strong law of large numbers
    relative entropy density
    entropy of initial distribution

    Identifiers