A limit theorem for the entropy density of nonhomogeneous Markov information source (Q1347196)
Full work available at URL: https://doi.org/10.1016/0167-7152(94)00080-r
OpenAlex ID: W1996682477
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | A limit theorem for the entropy density of nonhomogeneous Markov information source | scientific article | |
Statements
A limit theorem for the entropy density of nonhomogeneous Markov information source (English)
23 April 1995
A variant of the strong law of large numbers for the relative entropy density, valid for an arbitrary nonhomogeneous Markov information source with alphabet \(S = \{1,2, \dots, m\}\), is proved; this is an important question in information theory. Let \(\{X_n, n \geq 0\}\) be the sequence of successive letters produced by such a source with initial distribution \((p(1), p(2), \dots, p(m))\) and transition matrices \(P_n = (p_n(i,j))\), \(i, j \in S\), \(n \geq 1\), where \(p_n(i,j) = P(X_n = j \mid X_{n-1} = i)\). The relative entropy density is defined as \[ f_n(\omega) = -\frac{1}{n} \biggl[\log p(X_0) + \sum^n_{k=1} \log p_k(X_{k-1}, X_k) \biggr], \] and the entropy of a probability distribution \((p_1, p_2, \dots, p_m)\) is defined as \[ H(p_1, p_2, \dots, p_m) = -\sum^m_{k=1} p_k \log p_k, \] where \(\log\) denotes the natural logarithm. The main result is that, almost surely, \[ \lim_{n \to \infty} \biggl\{f_n(\omega) - \frac{1}{n} \sum^n_{k=1} H\bigl[p_k(X_{k-1},1), \dots, p_k(X_{k-1},m)\bigr]\biggr\} = 0. \]
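The convergence can be illustrated numerically. The following Python sketch is not part of the paper: the particular time-varying transition matrices, the alphabet size, and the helper names are assumptions chosen only for illustration. It simulates a nonhomogeneous Markov chain, computes the relative entropy density \(f_n(\omega)\) and the averaged conditional entropies \((1/n)\sum_{k=1}^n H[p_k(X_{k-1},1),\dots,p_k(X_{k-1},m)]\), and prints their difference for a large \(n\).

```python
# Illustrative sketch (assumed setup, not from the paper): simulate a
# nonhomogeneous Markov chain and compare the relative entropy density
# f_n with the averaged conditional entropies from the theorem.
import numpy as np

rng = np.random.default_rng(0)
m = 3                                  # alphabet size |S|
p0 = np.full(m, 1.0 / m)               # initial distribution (p(1), ..., p(m))

def transition_matrix(k):
    """Hypothetical time-varying transition matrix P_k (each row sums to 1)."""
    base = np.full((m, m), 1.0 / m)
    drift = 0.2 * np.sin(k / 50.0)      # bounded nonhomogeneous perturbation
    P = base + drift * (np.eye(m) - 1.0 / m)
    return P / P.sum(axis=1, keepdims=True)

def entropy(p):
    """H(p_1, ..., p_m) = -sum p_i log p_i, natural logarithm."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n = 100_000
x = rng.choice(m, p=p0)                 # X_0
log_terms = [np.log(p0[x])]             # log p(X_0)
cond_entropies = []                     # H(p_k(X_{k-1}, 1), ..., p_k(X_{k-1}, m))

for k in range(1, n + 1):
    P = transition_matrix(k)
    row = P[x]                          # p_k(X_{k-1}, .)
    cond_entropies.append(entropy(row))
    x = rng.choice(m, p=row)            # draw X_k
    log_terms.append(np.log(row[x]))    # log p_k(X_{k-1}, X_k)

f_n = -sum(log_terms) / n               # relative entropy density f_n(omega)
avg_H = sum(cond_entropies) / n         # (1/n) * sum_k H(p_k(X_{k-1}, .))
print(f"f_n = {f_n:.4f}, averaged entropy = {avg_H:.4f}, "
      f"difference = {f_n - avg_H:.2e}")
```

By the theorem, the printed difference should be close to zero for large \(n\) and tends to zero almost surely as \(n \to \infty\).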
strong law of large numbers
relative entropy density
entropy of initial distribution