A limit theorem for the entropy density of nonhomogeneous Markov information source (Q1347196)

From MaRDI portal
Property / author
 
Property / author: Wen Liu / rank
Normal rank
 
Property / author
 
Property / author: Wei-guo Yang / rank
Normal rank
 
Property / reviewed by
 
Property / reviewed by: Andrei I. Volodin / rank
Normal rank
 
Property / MaRDI profile type
 
Property / MaRDI profile type: MaRDI publication profile / rank
 
Normal rank
Property / cites work
 
Property / cites work: The strong ergodic theorem for densities: Generalized Shannon-McMillan- Breiman theorem / rank
 
Normal rank
Property / cites work
 
Property / cites work: Q3794956 / rank
 
Normal rank
Property / cites work
 
Property / cites work: The Individual Ergodic Theorem of Information Theory / rank
 
Normal rank
Property / cites work
 
Property / cites work: A Note on the Ergodic Theorem of Information Theory / rank
 
Normal rank
Property / cites work
 
Property / cites work: A counterexample to Perez's generalization of the Shannon-McMillan theorem / rank
 
Normal rank
Property / cites work
 
Property / cites work: Relative entropy densities and a class of limit theorems of the sequence of m-valued random variables / rank
 
Normal rank
Property / cites work
 
Property / cites work: The Basic Theorems of Information Theory / rank
 
Normal rank
Property / cites work
 
Property / cites work: A Mathematical Theory of Communication / rank
 
Normal rank
Property / full work available at URL
 
Property / full work available at URL: https://doi.org/10.1016/0167-7152(94)00080-r / rank
 
Normal rank
Property / OpenAlex ID
 
Property / OpenAlex ID: W1996682477 / rank
 
Normal rank

Latest revision as of 10:22, 30 July 2024

scientific article
Language: English
Label: A limit theorem for the entropy density of nonhomogeneous Markov information source
Description: scientific article
    Statements

    A limit theorem for the entropy density of nonhomogeneous Markov information source (English)
    23 April 1995
    A variant of the strong law of large numbers for the relative entropy density is proved, valid for an arbitrary nonhomogeneous Markov information source with alphabet \(S = \{1,2, \dots, m\}\). This is an important question in information theory. If \(\{X_n, n \geq 0\}\) is the sequence of successive letters produced by such a source with initial distribution \((p(1), p(2), \dots, p(m))\) and transition matrices \(P_n = (p_n (i,j))\), \(i, j \in S\), \(n \geq 1\), where \(p_n(i,j) = P(X_n = j\mid X_{n-1} = i)\), then the relative entropy density is defined as \[ f_n(\omega) = -(1/n) \biggl[\log p (X_0) + \sum^n_{k=1} \log p_k(X_{k-1}, X_k) \biggr], \] and the entropy of a distribution \((p_1, p_2, \dots, p_m)\) is defined as \[ H(p_1, p_2,\dots, p_m) = -\sum^m_{k=1} p_k \log p_k, \] where \(\log\) denotes the natural logarithm. The main result is that, almost surely, \[ \lim_{n \to \infty} \biggl\{f_n(\omega) - (1/n) \sum^n_{k=1} H [p_k (X_{k-1},1), \dots, p_k(X_{k-1},m)]\biggr\} = 0. \]
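The limit theorem above can be illustrated numerically: simulate a nonhomogeneous Markov chain, accumulate the relative entropy density \(f_n(\omega)\), and compare it with the running average of the conditional entropies of the rows actually visited. The two-letter alphabet and the particular time-varying transition matrices below are hypothetical choices made for the sketch, not taken from the paper.

```python
import math
import random

def simulate(n=200000, seed=0):
    """Simulate a nonhomogeneous Markov chain on S = {0, 1} and compare
    f_n(omega) with (1/n) * sum_k H[p_k(X_{k-1}, .)]."""
    rng = random.Random(seed)
    p0 = [0.5, 0.5]  # initial distribution (p(1), ..., p(m)), here m = 2

    def transition(k):
        # A hypothetical time-varying transition matrix P_k = (p_k(i, j)).
        a = 0.3 + 0.2 * math.sin(k / 10.0)  # row for state 0
        b = 0.6 + 0.1 * math.cos(k / 7.0)   # row for state 1
        return [[a, 1.0 - a], [b, 1.0 - b]]

    # Sample X_0 and start the log-probability of the observed path.
    x = 0 if rng.random() < p0[0] else 1
    log_prob = math.log(p0[x])
    entropy_sum = 0.0
    for k in range(1, n + 1):
        row = transition(k)[x]  # distribution of X_k given X_{k-1} = x
        # Conditional entropy H[p_k(X_{k-1}, 1), ..., p_k(X_{k-1}, m)]
        entropy_sum += -sum(p * math.log(p) for p in row)
        x_next = 0 if rng.random() < row[0] else 1
        log_prob += math.log(row[x_next])
        x = x_next

    f_n = -log_prob / n          # relative entropy density f_n(omega)
    return f_n, entropy_sum / n  # the averaged conditional entropies

f_n, h_bar = simulate()
print(f_n, h_bar, abs(f_n - h_bar))  # the difference shrinks as n grows
```

For large \(n\) the printed difference is small, consistent with the almost-sure convergence stated in the theorem; running the simulation for several values of \(n\) shows the gap decreasing.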
    strong law of large numbers
    relative entropy density
    entropy of initial distribution