Asymptotic properties of Neyman-Pearson tests for infinite Kullback-Leibler information (Q1095523)

scientific article

    Statements

    Asymptotic properties of Neyman-Pearson tests for infinite Kullback-Leibler information (English)
    1986
    Classical results of Stein, Chernoff and Rao concerning the rate of convergence of the second-kind error probabilities \(\beta_{\alpha,n}\) are generalized to the case where the Kullback-Leibler information \(K(P_0,P_1)\) is infinite. The main result is the following. Let \(q_{\alpha,n}\) denote the logarithm of the critical value of the Neyman-Pearson test of size \(\alpha\) for testing \(H: P=P_0\) against \(K: P=P_1\) with sample size \(n\), and let \(F\) be the distribution function of \(\log(dP_1/dP_0)\) with respect to \(P_0\). If \(F\) is positive and \(\limsup_{x\to-\infty} F(\lambda_0 x)/F(x) < 1\) for some \(\lambda_0>1\), then \(\lim_{n\to\infty}\beta_{\alpha,n}^{1/q_{\alpha,n}}=\exp\{-1\}\). It follows that, in contrast to the classical results, the rate of convergence of the second-kind error probabilities and of the critical values depends on the level \(\alpha\) when \(K(P_0,P_1)=\infty\). Moreover, the relation between \(q_{\alpha,n}\) and the local behaviour of the Laplace transform of \(F\) is studied, and an application to a one-sided testing problem is discussed in detail.
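    The limit above can be illustrated numerically. Below is a minimal Monte Carlo sketch (not from the paper), assuming the concrete pair \(P_0\) = standard Cauchy and \(P_1=N(0,1)\): for this pair \(K(P_0,P_1)=\infty\), because \(\log(dP_1/dP_0)(x)\sim -x^2/2\) has infinite negative expectation under \(P_0\), and the tail condition on \(F\) holds since \(F(x)\) decays like \(|x|^{-1/2}\) as \(x\to-\infty\). Writing \(S_n=\sum_i\log(dP_1/dP_0)(X_i)\) and \(t_{\alpha,n}\) for its \((1-\alpha)\)-quantile under \(P_0\) (which coincides with \(q_{\alpha,n}\) up to the sign convention for the critical value, an assumption on my part), the limit says roughly that \(\log\beta_{\alpha,n}/t_{\alpha,n}\to 1\). Since \(\beta_{\alpha,n}\) is far too small for direct simulation under \(P_1\), the sketch estimates it through the change of measure \(\beta_{\alpha,n}=E_{P_0}[e^{S_n};\,S_n\le t_{\alpha,n}]\); the sample size \(n=50\) and the levels \(\alpha\) are arbitrary choices.

    # Python sketch (illustrative only; distributions and constants are assumptions, not from the paper).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def log_lr(x):
        # log(dP_1/dP_0)(x) for P_1 = N(0,1) and P_0 = standard Cauchy
        return stats.norm.logpdf(x) - stats.cauchy.logpdf(x)

    def np_test(n, alpha, reps=100_000):
        # Neyman-Pearson test: reject H (P = P_0) when S_n exceeds its
        # (1 - alpha)-quantile t under P_0, so the test has size alpha.
        s0 = log_lr(stats.cauchy.rvs(size=(reps, n), random_state=rng)).sum(axis=1)
        t = np.quantile(s0, 1 - alpha)
        # Second-kind error via change of measure: beta = E_{P_0}[exp(S_n); S_n <= t],
        # accumulated stably on the log scale with logaddexp.
        log_beta = np.logaddexp.reduce(s0[s0 <= t]) - np.log(reps)
        return t, log_beta

    for alpha in (0.10, 0.01):
        t, log_beta = np_test(n=50, alpha=alpha)
        print(f"alpha={alpha:4.2f}  t={t:9.1f}  log(beta)={log_beta:9.1f}  log(beta)/t={log_beta / t:5.3f}")

    For moderate \(n\) the printed ratio is only roughly 1, but the strong dependence of the critical value, and hence of the rate of convergence, on \(\alpha\) is already apparent, in line with the remark above.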
    exponential families
    log-likelihood distribution
    rate of convergence of second kind error probabilities
    Kullback-Leibler information
    critical value
    Neyman-Pearson test
    Laplace transform

    Identifiers