Approximation theory of output statistics
Publication: 4277135
DOI: 10.1109/18.256486
zbMath: 0784.94016
OpenAlex: W2162635854
MaRDI QID: Q4277135
Publication date: 7 February 1994
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/7c3960cd90e30e1c4db03f05ba92a4247158c07e
Keywords: channel capacity; resolvability; Shannon theory; channel output statistics; noiseless source coding; identification via channels; random number generation complexity
Related Items (13)
An information-theoretic analysis of return maximization in reinforcement learning ⋮ Basics of Secrecy Coding ⋮ Common randomness and distributed control: A counterexample ⋮ Universal Features for High-Dimensional Learning and Inference ⋮ Min- and max-entropy in infinite dimensions ⋮ Asymptotic convertibility of entanglement: An information-spectrum approach to entanglement concentration and dilution ⋮ Identification via Quantum Channels ⋮ Second-order converses via reverse hypercontractivity ⋮ Common Information, Noise Stability, and Their Extensions ⋮ On identification ⋮ Over-the-Air computation for distributed machine learning and consensus in large wireless networks ⋮ Second-order asymptotics for the classical capacity of image-additive quantum channels ⋮ On the source-channel separation theorem for infinite source alphabets