A limit theorem for the entropy density of nonhomogeneous Markov information source
Publication: 1347196
DOI: 10.1016/0167-7152(94)00080-R
zbMath: 0833.60034
OpenAlex: W1996682477
MaRDI QID: Q1347196
Publication date: 23 April 1995
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/0167-7152(94)00080-r
Related Items (3)
- A limit theorem for partial sums of random variables and its applications
- Strong law of large numbers for Markov chains field on a Bethe tree
- A strong limit theorem for the average of ternary functions of Markov chains in bi-infinite random environments
Cites Work
- A Mathematical Theory of Communication
- Relative entropy densities and a class of limit theorems of the sequence of m-valued random variables
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- A counterexample to Perez's generalization of the Shannon-McMillan theorem
- The Individual Ergodic Theorem of Information Theory
- A Note on the Ergodic Theorem of Information Theory
- The Basic Theorems of Information Theory