The Basic Theorems of Information Theory

From MaRDI portal
Publication: 5818850

DOI: 10.1214/aoms/1177729028
zbMath: 0050.35501
OpenAlex: W1970091448
Wikidata: Q114830681
Scholia: Q114830681
MaRDI QID: Q5818850

B. McMillan

Publication date: 1953

Published in: The Annals of Mathematical Statistics

Full work available at URL: https://doi.org/10.1214/aoms/1177729028



Related Items

Shannon's theorem for locally compact groups
Recent progress in ergodic theory
Entropy Rate and Maximum Entropy Methods for Countable Semi-Markov Chains
On convergence of information in spaces with infinite invariant measure
The Markov approximation of the sequences of \(N\)-valued random variables and a class of small deviation theorems.
Equivalence of the maximum likelihood estimator to a minimum entropy estimator
Statistical mechanics and information-theoretic perspectives on complexity in the Earth system
Strong deviation theorems for general information sources
An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting
To the history of the appearance of the notion of the \(\varepsilon\)-entropy of an automorphism of a Lebesgue space and \((\varepsilon,T)\)-entropy of a dynamical system with continuous time
THE SHANNON–MCMILLAN THEOREM FOR MARKOV CHAINS INDEXED BY A CAYLEY TREE IN RANDOM ENVIRONMENT
An information-theoretic analysis of return maximization in reinforcement learning
Upper Bounds on Mixing Time of Finite Markov Chains
A limit theorem for the entropy density of nonhomogeneous Markov information source
THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS
Entropy for semi-Markov processes with Borel state spaces: asymptotic equirepartition properties and invariance principles
The generalized entropy ergodic theorem for nonhomogeneous Markov chains
Basics of Secrecy Coding
A Shannon-McMillan theorem for motley names
On the entropy for semi-Markov processes
On moving averages and asymptotic equipartition of information
Strong law of large numbers for generalized sample relative entropy of nonhomogeneous Markov chains
Large deviations for non-uniformly expanding maps
The strong law of large numbers and Shannon-McMillan theorem for Markov chains indexed by an infinite tree with uniformly bounded degree in random environment
Some generalized strong limit theorems for Markov chains in bi-infinite random environments
An extension of Shannon-McMillan theorem and some limit properties for nonhomogeneous Markov chains
A note on entropy of finitely ergodic compact topological dynamical systems
Markov approximation and the generalized entropy ergodic theorem for non-null stationary process
Entropy of action of semi-groups
Entropy statistic theorem and variational principle for \(t\)-entropy are equivalent
A local approach to g-entropy
The generalized entropy ergodic theorem for nonhomogeneous bifurcating Markov chains indexed by a binary tree
Strong Shannon–McMillan–Breiman’s theorem for locally compact groups
On entropy localization of doubly stochastic operators
Entropy and compression: a simple proof of an inequality of Khinchin-Ornstein-Shields
The Comparison between Arbitrary Information Sources and Memoryless Information Sources and its Limit Properties
Fuzzy entropy of action of semi-groups
Logarithmic expansion, entropy, and dimension for set-valued maps
Bowen entropy of sets of generic points for fixed-point free flows
Why use population entropy? It determines the rate of convergence
Strong laws of large numbers for the mth-order asymptotic odd–even Markov chains indexed by an m-rooted Cayley tree
On the stochastic representation and Markov approximation of Hamiltonian systems
A Unified Theory of Neuro-MRI Data Shows Scale-Free Nature of Connectivity Modes
A strong limit theorem for functions of continuous random variables and an extension of the Shannon-McMillan theorem
THE ASYMPTOTIC EQUIPARTITION PROPERTY FOR ASYMPTOTIC CIRCULAR MARKOV CHAINS
Coding Markov chains from the past
An Elementary Derivation of the Large Deviation Rate Function for Finite State Markov Chains
Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics
Absolute continuity of information channels
Entropy of the symbolic sequence for critical circle maps
PARALLELIZING GRAMMATICAL FUNCTIONS: P600 AND P345 REFLECT DIFFERENT COST OF REANALYSIS
Entropy Analysis of Noise Contaminated Sequences
Some limit properties for the \(m\)th-order nonhomogeneous Markov chains indexed by an m rooted Cayley tree
Estimation of the Entropy Rate of a Countable Markov Chain
Guessing probability distributions from small samples
Concentration of the information in data with log-concave distributions
Complementarity in classical dynamical systems
The Asymptotic Equipartition Property for a Nonhomogeneous Markov Information Source
A class of strong limit theorems for countable nonhomogeneous Markov chains on the generalized gambling system
A spectral representation for the entropy of topological dynamical systems
On local metric pressure of dynamical systems
On Time-Free Functions
The strong law of large numbers and the Shannon-McMillan theorem for the mth-order nonhomogeneous Markov chains indexed by an m rooted Cayley tree
Entropy as an integral operator: erratum and modification
Classical capacity of quantum channels with general Markovian correlated noise
A class of small deviation theorems for the random variables associated with mth-order asymptotic circular Markov chains
Some Research on Shannon–McMillan Theorem for mth-Order Nonhomogeneous Markov Information Source
The Shannon-McMillan-Breiman theorem beyond amenable groups
On the basic theorems of information theory
Über die Struktur der mittleren Entropie
Universal Neural Field Computation
Contributions to information theory for abstract alphabets
Metric entropy for set-valued maps
Tree-indexed Markov chains in random environment and some of their strong limit properties
On Hudetz entropy localization
Compactification of the stationary channel space.
The dimension of some sets defined in terms of f-expansions
A strong limit theorem for the average of ternary functions of Markov chains in bi-infinite random environments
On local entropy of fuzzy partitions
Finite sample effects in sequence analysis
Sup-sums principles for \(F\)-divergence and a new definition for \(t\)-entropy
Topological entropy for set-valued maps