Decision theory and large deviations for dynamical hypotheses tests: the Neyman-Pearson lemma, min-max and Bayesian tests
From MaRDI portal
Publication:2140235
Abstract: We analyze hypothesis tests using classical results on large deviations to compare two models, each described by a different Hölder Gibbs probability measure. One main difference from the classical hypothesis tests of decision theory is that here the two measures are singular with respect to each other. Among other objectives, we are interested in the decay rate of the probability of a wrong decision as the sample size goes to infinity. We show a dynamical version of the Neyman-Pearson lemma displaying the ideal test within a certain class of similar tests. This test becomes exponentially better than the alternative tests as the sample size goes to infinity, and we present the explicit exponential decay rate. We also consider both the min-max and a certain type of Bayesian hypothesis tests. We treat these tests in the log-likelihood framework, using several tools of thermodynamic formalism. Versions of Stein's lemma and of Chernoff information are also presented.
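The classical i.i.d. analogue of the abstract's setting can be sketched in a few lines: a Neyman-Pearson likelihood-ratio test between two Bernoulli models, whose type-II error decays exponentially in the sample size, with the ideal decay exponent given by a Kullback-Leibler divergence (Stein's lemma). This is a minimal illustration only; the Bernoulli parameters and function names are hypothetical choices, and the paper itself works with singular Gibbs measures rather than i.i.d. product measures.

```python
import math
import random

def log_likelihood_ratio(sample, p, q):
    """Log of P_H0(sample) / P_H1(sample) for i.i.d. Bernoulli models
    H0: Bernoulli(p) versus H1: Bernoulli(q)."""
    return sum(math.log(p / q) if x else math.log((1 - p) / (1 - q))
               for x in sample)

def neyman_pearson_decide(sample, p, q, threshold=0.0):
    """Neyman-Pearson rule: decide H0 iff the log-likelihood ratio
    exceeds the threshold (threshold 0 is the maximum-likelihood test)."""
    return log_likelihood_ratio(sample, p, q) > threshold

def kl_bernoulli(a, b):
    """Kullback-Leibler divergence D(Bernoulli(a) || Bernoulli(b)),
    the ideal Stein-lemma exponent for the type-II error."""
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

def empirical_type2_error(n, p, q, trials=5000, seed=0):
    """Fraction of H1-generated samples of size n that the test
    wrongly attributes to H0 (a Monte Carlo estimate)."""
    rng = random.Random(seed)
    wrong = sum(neyman_pearson_decide([rng.random() < q for _ in range(n)], p, q)
                for _ in range(trials))
    return wrong / trials

if __name__ == "__main__":
    p, q = 0.7, 0.3  # illustrative, well-separated hypotheses
    for n in (5, 20, 80):
        print(n, empirical_type2_error(n, p, q))
    # The error shrinks exponentially in n; the ideal Stein exponent
    # for the type-II error is D(H0 || H1):
    print(kl_bernoulli(p, q))
```

The estimated type-II error drops rapidly as the sample size grows, which is the classical counterpart of the exponential comparison of tests described in the abstract.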
Recommendations
- Dynamical hypothesis tests and decision theory for Gibbs distributions
- Averaging of the Neumann problems for nonlinear elliptic equations in domains with accumulators
- Large-deviation theorems in hypothesis-testing problems
- On many hypotheses logarithmically asymptotically optimal testing via the theory of large deviations
Cites work
- scientific article; zbMATH DE number 3864329
- scientific article; zbMATH DE number 3686563
- scientific article; zbMATH DE number 49597
- scientific article; zbMATH DE number 3551675
- scientific article; zbMATH DE number 592670
- A Bayesian approach for estimating the parameters of an α-stable distribution
- An iterative process for approximating subactions
- Bayes posterior convergence for loss functions via almost additive thermodynamic formalism
- Binary Hypothesis Testing Game With Training Data
- Circular unitary ensembles: parametric models and their asymptotic maximum likelihood estimates
- Different closed-form expressions for generalized entropy rates of Markov chains
- Entropy and large deviation
- Entropy, large deviations, and statistical mechanics.
- Ergodic optimization, zero temperature limits and the max-plus algebra. Paper from the 29th Brazilian mathematics colloquium (29º Colóquio Brasileiro de Matemática), Rio de Janeiro, Brazil, July 22 -- August 2, 2013
- Escort distributions minimizing the Kullback-Leibler divergence for a large deviations principle and tests of entropy level
- Gibbs posterior convergence and the thermodynamic formalism
- Large Deviations in Dynamical Systems and Stochastic Processes
- Large deviations for empirical entropies of g-measures
- Large deviations techniques and applications.
- Nonequilibrium in thermodynamic formalism: the second law, gases and information geometry
- On entropy production of repeated quantum measurements. I. General theory
- On information gain, Kullback-Leibler divergence, entropy production and the involution kernel
- Power of the likelihood ratio test for models of DNA base substitution
Cited in (2)