The quantum relative entropy as a rate function and information criteria
From MaRDI portal
Abstract: We prove that the quantum relative entropy is a rate function in a large deviation principle. Next, we define information criteria for quantum states and estimate the accuracy of their use. Most of the results in this paper are essentially based on the Hiai-Ohya-Tsukada theorem.
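For orientation, the quantity discussed in the abstract is the Umegaki quantum relative entropy, \(S(\rho\|\sigma) = \operatorname{Tr}\,\rho(\log\rho - \log\sigma)\), for density matrices \(\rho, \sigma\). The following is a minimal numerical sketch (not from the paper itself), assuming finite-dimensional, full-rank density matrices so that the matrix logarithm is well defined; it uses NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)].

    Uses the natural logarithm; assumes rho and sigma are full-rank
    density matrices (positive definite, unit trace).
    """
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Example: a biased qubit state against the maximally mixed state.
rho = np.array([[0.75, 0.0], [0.0, 0.25]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])
print(quantum_relative_entropy(rho, sigma))  # ≈ 0.1308
```

For commuting (diagonal) states this reduces to the classical Kullback-Leibler divergence of the eigenvalue distributions, which is the quantity appearing as a rate function in the classical Sanov-type theorems cited below.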
Recommendations
- scientific article; zbMATH DE number 2103533
- The proper formula for relative entropy and its asymptotics in quantum probability
- Extremal properties of relative entropy in quantum statistical mechanics
- A generalization of quantum Stein's lemma
- Characterization of the relative entropy of states of matrix algebras
Cites work
- scientific article; zbMATH DE number 1579275
- scientific article; zbMATH DE number 4174026
- scientific article; zbMATH DE number 3167426
- scientific article; zbMATH DE number 3680516
- scientific article; zbMATH DE number 3725098
- scientific article; zbMATH DE number 3783731
- scientific article; zbMATH DE number 107482
- scientific article; zbMATH DE number 1120256
- scientific article; zbMATH DE number 1158743
- scientific article; zbMATH DE number 1560711
- A Generalized Bayes Rule for Prediction
- A Unified Scheme for Generalized Sectors Based on Selection Criteria: Order Parameters of Symmetries and of Thermality and Physical Meanings of Adjunctions
- A Unified Scheme of Measurement and Amplification Processes Based on Micro-Macro Duality — Stern-Gerlach Experiment as a Typical Example
- A new look at the statistical model identification
- A quantum version of Sanov's theorem
- A simple proof of Sanov's theorem
- A variational expression for the relative entropy
- Algebraic Geometry and Statistical Learning Theory
- An Information-Spectrum Approach to Classical and Quantum Hypothesis Testing for Simple Hypotheses
- Asymptotic equivalence of Bayes cross validation and widely applicable information criterion in singular learning theory
- Asymptotic of Varadhan-type and the Gibbs variational principle
- Entropic Fluctuations in Quantum Statistical Mechanics. An Introduction
- Goodness of prediction fit
- Large deviation strategy for inverse problem. I
- Large deviation strategy for inverse problem. II
- Large deviations in quantum spin chains
- Lectures on algebraic statistics
- Micro-macro duality in quantum physics
- On Error Exponents in Quantum Hypothesis Testing
- On the Foundations of Information Theory
- Quantum \(f\)-divergences and error correction
- Quantum detection and estimation theory
- Quantum hypothesis testing and non-equilibrium statistical mechanics
- Quantum perfect correlations
- Quasi-entropies for states of a von Neumann algebra
- Relative entropy and the Wigner-Yanase-Dyson-Lieb concavity in an interpolation theory
- Relative entropy for states of von Neumann algebras. II
- Strong converse and Stein's lemma in quantum hypothesis testing
- Sufficiency and relative entropy in *-algebras with applications in quantum systems
- Sufficiency in quantum statistical inference
- Testing Statistical Hypotheses
- The Chernoff lower bound for symmetric quantum hypothesis testing
- The proper formula for relative entropy and its asymptotics in quantum probability
- Theory of operator algebras I.
Cited in (4)