The Individual Ergodic Theorem of Information Theory
Publication:3245606
DOI: 10.1214/aoms/1177706899 · zbMath: 0078.31801 · OpenAlex: W1968250327 · MaRDI QID: Q3245606
Publication date: 1957
Published in: The Annals of Mathematical Statistics
Full work available at URL: https://doi.org/10.1214/aoms/1177706899
Related Items (67)
Shannon's theorem for locally compact groups ⋮ Recent progress in ergodic theory ⋮ Entropy Rate and Maximum Entropy Methods for Countable Semi-Markov Chains ⋮ A monotone Sinai theorem ⋮ On convergence of information in spaces with infinite invariant measure ⋮ The Markov approximation of the sequences of \(N\)-valued random variables and a class of small deviation theorems. ⋮ On the ergodic theory of Tanaka-Ito type \(\alpha \)-continued fractions ⋮ UNIVERSAL CODING AND PREDICTION ON ERGODIC RANDOM POINTS ⋮ Strong deviation theorems for general information sources ⋮ An entropy problem of the \(\alpha \)-continued fraction maps ⋮ Bounds on Data Compression Ratio with a Given Tolerable Error Probability ⋮ An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting ⋮ An information-theoretic analysis of return maximization in reinforcement learning ⋮ KERNEL-BASED SEMI-LOG-OPTIMAL EMPIRICAL PORTFOLIO SELECTION STRATEGIES ⋮ Upper Bounds on Mixing Time of Finite Markov Chains ⋮ A limit theorem for the entropy density of nonhomogeneous Markov information source ⋮ THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS ⋮ The generalized entropy ergodic theorem for nonhomogeneous Markov chains ⋮ On the entropy for semi-Markov processes ⋮ On moving averages and asymptotic equipartition of information ⋮ Strong law of large numbers for generalized sample relative entropy of non homogeneous Markov chains ⋮ Measure-theoretic construction for information theory ⋮ Large deviations for non-uniformly expanding maps ⋮ The strong law of large numbers and Shannon-McMillan theorem for Markov chains indexed by an infinite tree with uniformly bounded degree in random environment ⋮ Some generalized strong limit theorems for Markov chains in bi-infinite random environments ⋮ An extension of Shannon-McMillan theorem and some limit properties for nonhomogeneous Markov chains ⋮ An ergodic theorem for constrained sequences of functions ⋮ A note on entropy of finitely ergodic compact topological dynamical systems ⋮ Markov approximation and the generalized entropy ergodic theorem for non-null stationary process ⋮ The asymptotic equipartition property of Markov chains in single infinite Markovian environment on countable state space ⋮ An Operator Ergodic Theorem for Sequences of Functions ⋮ A NOTE ON THE LEARNING-THEORETIC CHARACTERIZATIONS OF RANDOMNESS AND CONVERGENCE ⋮ The generalized entropy ergodic theorem for nonhomogeneous bifurcating Markov chains indexed by a binary tree ⋮ Strong Shannon–McMillan–Breiman's theorem for locally compact groups ⋮ On entropy localization of doubly stochastic operators ⋮ Entropy and compression: a simple proof of an inequality of Khinchin-Ornstein-Shields ⋮ The Comparison between Arbitrary Information Sources and Memoryless Information Sources and its Limit Properties ⋮ Fuzzy entropy of action of semi-groups ⋮ Bowen entropy of sets of generic points for fixed-point free flows ⋮ Estimating the conditional expectations for continuous time stationary processes ⋮ Strong laws of large numbers for the mth-order asymptotic odd–even Markov chains indexed by an m-rooted Cayley tree ⋮ A conversation with Leo Breiman.
⋮ The Shannon-McMillan theorem for ergodic quantum lattice systems ⋮ Nonparametric sequential prediction of time series ⋮ An Elementary Derivation of the Large Deviation Rate Function for Finite State Markov Chains ⋮ Relative complexity of random walks in random sceneries ⋮ Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics ⋮ Some limit properties for the \(m\)th-order nonhomogeneous Markov chains indexed by an m rooted Cayley tree ⋮ The Asymptotic Equipartition Property for a Nonhomogeneous Markov Information Source ⋮ Entropy of flows, revisited ⋮ A class of strong limit theorems for countable nonhomogeneous Markov chains on the generalized gambling system ⋮ On universal algorithms for classifying and predicting stationary processes ⋮ A spectral representation for the entropy of topological dynamical systems ⋮ On local metric pressure of dynamical systems ⋮ The strong law of large numbers and the Shannon-McMillan theorem for the mth-order nonhomogeneous Markov chains indexed by an m rooted Cayley tree ⋮ Entropy as an integral operator: erratum and modification ⋮ Singular measures and Hausdorff measures ⋮ Optimal data compression algorithm ⋮ Universal Data Compression Algorithm Based on Approximate String Matching ⋮ Some Research on Shannon–McMillan Theorem for mth-Order Nonhomogeneous Markov Information Source ⋮ The Shannon-McMillan-Breiman theorem beyond amenable groups ⋮ Contributions to information theory for abstract alphabets ⋮ Ergodic properties of conditional forecast functions of stable systems ⋮ Tree-indexed Markov chains in random environment and some of their strong limit properties ⋮ Some remarks concerning the individual ergodic theorem of information theory ⋮ Entropy and dimension of disintegrations of stationary measures ⋮ A strong limit theorem for the average of ternary functions of Markov chains in bi-infinite random environments