Rényi's divergence and entropy rates for finite alphabet Markov sources
DOI: 10.1109/18.923736 · zbMath: 1016.94010 · OpenAlex: W2006624419 · MaRDI QID: Q4544589
Authors: Fady Alajaji, L. Lorne Campbell, Ziad S. Rached
Publication date: 4 August 2002
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.923736
Keywords: Perron-Frobenius theory; Rényi divergence rate; source coding; Kullback-Leibler divergence rate; Rényi's entropy rates; time-invariant Markov sources
MSC: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10); Source coding (94A29)
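The keywords summarize the paper's setting: Rényi divergence and entropy rates of time-invariant, finite-alphabet Markov sources, obtained via Perron-Frobenius theory. As an illustration only (not the authors' code), the minimal sketch below assumes the standard eigenvalue characterization H_α = (1/(1−α)) log λ_α, where λ_α is the Perron-Frobenius eigenvalue of the matrix with entries p(j|i)^α; the function name renyi_entropy_rate and the two-state chain are illustrative choices.

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Rényi entropy rate (order alpha != 1, in nats) of a stationary
    finite-alphabet Markov source with transition matrix P, via the
    Perron-Frobenius eigenvalue of the elementwise power matrix [P_ij**alpha]."""
    R_alpha = P ** alpha                              # elementwise alpha-th power
    lam = np.max(np.abs(np.linalg.eigvals(R_alpha)))  # spectral radius = Perron eigenvalue
    return np.log(lam) / (1.0 - alpha)

# Hypothetical two-state example; values approach the Shannon entropy rate as alpha -> 1.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
for a in (0.5, 0.99, 2.0):
    print(a, renyi_entropy_rate(P, a))
```

For an i.i.d. source (all rows of P equal to a distribution p), λ_α reduces to Σ p_i^α, recovering the ordinary Rényi entropy of p.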
Related Items (7)
A note on Rényi's entropy rate for time-inhomogeneous Markov chains
Some properties of Rényi entropy and Rényi entropy rate
Different closed-form expressions for generalized entropy rates of Markov chains
On discrete-time multiallelic evolutionary dynamics driven by selection
Odd-Burr generalized family of distributions with some applications
Geometric reduction for identity testing of reversible Markov chains
Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics