The method of types [information theory]
DOI: 10.1109/18.720546 · zbMath: 0933.94012 · OpenAlex: W4385773637 · MaRDI QID: Q4701200
Publication date: 21 November 1999
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.720546
Keywords: hypothesis testing; large deviations theory; error exponents; channels; Shannon theory; multiuser; method of types
Hypothesis testing in multivariate analysis (62H15)
Research exposition (monographs, survey articles) pertaining to information and communication theory (94-02)
Information theory (general) (94A15)
Channel models (including quantum) in information and communication theory (94A40)
Coding theorems (Shannon theory) (94A24)
Source coding (94A29)
Related Items
Multivariate trace inequalities, p-fidelity, and universal recovery beyond tracial settings ⋮ Achieving positive rates with predetermined dictionaries ⋮ Asymptotic equivalence of empirical likelihood and Bayesian MAP ⋮ Weighted approximate Bayesian computation via Sanov's theorem ⋮ An information-theoretic analysis of return maximization in reinforcement learning ⋮ Concentration of the collision estimator ⋮ Discriminating quantum states: the multiple Chernoff distance ⋮ On the second-order asymptotics for entanglement-assisted communication ⋮ A lower bound on the quantum capacity of channels with correlated errors ⋮ An information-theoretic model for steganography ⋮ On the VC-Dimension of Binary Codes ⋮ Equivalence and nonequivalence of ensembles: thermodynamic, macrostate, and measure levels ⋮ Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem ⋮ The Pólya information divergence ⋮ Finite Blocklength Lossy Source Coding for Discrete Memoryless Sources ⋮ Optimal non-asymptotic concentration of centered empirical relative entropy in the high-dimensional regime ⋮ Asymptotic values of the Hall-ratio for graph powers ⋮ Asymptotic dependency structure of multiple signals ⋮ State estimation via limited capacity noisy communication channels ⋮ The problem of stabilization of networked systems under computational power constraints ⋮ Multivariate trace inequalities ⋮ Multiple Objects: Error Exponents in Hypotheses Testing and Identification ⋮ An Elementary Derivation of the Large Deviation Rate Function for Finite State Markov Chains ⋮ Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics ⋮ The asymptotic equipartition property in reinforcement learning and its relation to return maximization ⋮ Constructing perfect steganographic systems ⋮ Log-efficient search for significant inputs of linear model ⋮ Asymptotically optimal perfect steganographic systems ⋮ Correlation detection and an operational interpretation of the Rényi mutual information ⋮ Zero-knowledge blackbox testing: where are the faults? ⋮ Testing of Hypothesis and Identification ⋮ Formalization of Shannon's theorems ⋮ Properties of noncommutative Rényi and Augustin information ⋮ T-square tensors. I: Inequalities ⋮ Second-order asymptotics for the classical capacity of image-additive quantum channels