The Individual Ergodic Theorem of Information Theory

From MaRDI portal
Publication: 3245606

DOI: 10.1214/aoms/1177706899 · zbMath: 0078.31801 · OpenAlex: W1968250327 · MaRDI QID: Q3245606

Leo Breiman

Publication date: 1957

Published in: The Annals of Mathematical Statistics

Full work available at URL: https://doi.org/10.1214/aoms/1177706899

Related Items (67)

Shannon's theorem for locally compact groups
Recent progress in ergodic theory
Entropy Rate and Maximum Entropy Methods for Countable Semi-Markov Chains
A monotone Sinai theorem
On convergence of information in spaces with infinite invariant measure
The Markov approximation of the sequences of \(N\)-valued random variables and a class of small deviation theorems.
On the ergodic theory of Tanaka-Ito type \(\alpha \)-continued fractions
UNIVERSAL CODING AND PREDICTION ON ERGODIC RANDOM POINTS
Strong deviation theorems for general information sources
An entropy problem of the \(\alpha \)-continued fraction maps
Bounds on Data Compression Ratio with a Given Tolerable Error Probability
An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting
An information-theoretic analysis of return maximization in reinforcement learning
KERNEL-BASED SEMI-LOG-OPTIMAL EMPIRICAL PORTFOLIO SELECTION STRATEGIES
Upper Bounds on Mixing Time of Finite Markov Chains
A limit theorem for the entropy density of nonhomogeneous Markov information source
THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS
The generalized entropy ergodic theorem for nonhomogeneous Markov chains
On the entropy for semi-Markov processes
On moving averages and asymptotic equipartition of information
Strong law of large numbers for generalized sample relative entropy of non homogeneous Markov chains
Measure-theoretic construction for information theory
Large deviations for non-uniformly expanding maps
The strong law of large numbers and Shannon-McMillan theorem for Markov chains indexed by an infinite tree with uniformly bounded degree in random environment
Some generalized strong limit theorems for Markov chains in bi-infinite random environments
An extension of Shannon-McMillan theorem and some limit properties for nonhomogeneous Markov chains
An ergodic theorem for constrained sequences of functions
A note on entropy of finitely ergodic compact topological dynamical systems
Markov approximation and the generalized entropy ergodic theorem for non-null stationary process
The asymptotic equipartition property of Markov chains in single infinite Markovian environment on countable state space
An Operator Ergodic Theorem for Sequences of Functions
A NOTE ON THE LEARNING-THEORETIC CHARACTERIZATIONS OF RANDOMNESS AND CONVERGENCE
The generalized entropy ergodic theorem for nonhomogeneous bifurcating Markov chains indexed by a binary tree
Strong Shannon–McMillan–Breiman's theorem for locally compact groups
On entropy localization of doubly stochastic operators
Entropy and compression: a simple proof of an inequality of Khinchin-Ornstein-Shields
The Comparison between Arbitrary Information Sources and Memoryless Information Sources and its Limit Properties
Fuzzy entropy of action of semi-groups
Bowen entropy of sets of generic points for fixed-point free flows
Estimating the conditional expectations for continuous time stationary processes
Strong laws of large numbers for the mth-order asymptotic odd–even Markov chains indexed by an m-rooted Cayley tree
A conversation with Leo Breiman.
The Shannon-McMillan theorem for ergodic quantum lattice systems
Nonparametric sequential prediction of time series
An Elementary Derivation of the Large Deviation Rate Function for Finite State Markov Chains
Relative complexity of random walks in random sceneries
Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics
Some limit properties for the \(m\)th-order nonhomogeneous Markov chains indexed by an m rooted Cayley tree
The Asymptotic Equipartition Property for a Nonhomogeneous Markov Information Source
Entropy of flows, revisited
A class of strong limit theorems for countable nonhomogeneous Markov chains on the generalized gambling system
On universal algorithms for classifying and predicting stationary processes
A spectral representation for the entropy of topological dynamical systems
On local metric pressure of dynamical systems
The strong law of large numbers and the Shannon-McMillan theorem for the mth-order nonhomogeneous Markov chains indexed by an m rooted Cayley tree
Entropy as an integral operator: erratum and modification
Singular measures and Hausdorff measures
Optimal data compression algorithm
Universal Data Compression Algorithm Based on Approximate String Matching
Some Research on Shannon–McMillan Theorem for mth-Order Nonhomogeneous Markov Information Source
The Shannon-McMillan-Breiman theorem beyond amenable groups
Contributions to information theory for abstract alphabets
Ergodic properties of conditional forecast functions of stable systems
Tree-indexed Markov chains in random environment and some of their strong limit properties
Some remarks concerning the individual ergodic theorem of information theory
Entropy and dimension of disintegrations of stationary measures
A strong limit theorem for the average of ternary functions of Markov chains in bi-infinite random environments
This page was built for publication: The Individual Ergodic Theorem of Information Theory