On large deviation theorem for data-driven Neyman's statistic (Q1977641)

Cites work:
    Testing Uniformity Via Log-Spline Modeling
    Generalized intermediate efficiency of goodness-of-fit tests
    Data driven smooth tests for composite hypotheses
    Q4246938
    Asymptotic optimality of data-driven Neyman's tests for uniformity
    Consistency and Monte Carlo simulation of a data driven version of smooth goodness-of-fit tests
    Data-Driven Smooth Tests When the Hypothesis Is Composite
    Data-Driven Version of Neyman's Smooth Test of Fit
    A probability inequality for obtaining lower bounds in the large deviation principle
    Q5769351
    Q5533878
    On Sums of Random Vectors
    Q3253942
    Estimating the dimension of a model
    Exponential inequalities for sums of random vectors


Language: English
Label: On large deviation theorem for data-driven Neyman's statistic
Description: scientific article

    Statements

    On large deviation theorem for data-driven Neyman's statistic (English)
    19 September 2000
    Let \(U_1, U_2,\ldots\) be i.i.d. random variables uniformly distributed on the unit interval. Let \(\Phi_0, \Phi_1,\ldots\), \(\Phi_0\equiv 1\), be a complete orthogonal system in \(L_2[0,1]\). Let \((m_n)\) be a sequence of natural numbers such that \[ (A2)\qquad m_n\to \infty \text{ as } n\to \infty, \quad\text{and}\quad m_n= o\bigl((n/\log n)^{1/(2\omega-2)}\bigr) \text{ for some } \omega\geq 0. \] For every \(n\) define \[ S2=\min\bigl\{k\leq m_n: |\overline{\Phi}|_k^2-k(\log n/n)\geq \max_{1\leq j\leq m_n}\{|\overline{\Phi}|_j^2-j(\log n/n)\}\bigr\}, \] \[ \text{where}\qquad |\overline{\Phi}|_k^2=n^{-2}\sum_{j=1}^{k} \Bigl(\sum_{i=1}^{n}\Phi_j(U_i)\Bigr)^2. \] The data-driven Neyman's statistic is then \[ T_{S2}=n|\overline{\Phi}|_{S2}^2=n^{-1}\sum_{j=1}^{S2} \Bigl(\sum_{i=1}^{n}\Phi_j(U_i)\Bigr)^2. \] In fact, \(T_{S2}\) is completely determined by the choice of the basis \(\Phi\) and the control sequence \((m_n)\). The main result of the paper is \textbf{Theorem 1}: If (A2) is satisfied and \(\Phi\) is the Legendre basis or the cosine basis, then for every bounded sequence \((x_n)\) of positive numbers such that \(m_n^{2\omega +1}x_n^2\to \infty\) as \(n\to \infty\) we have \[ (nx_n^2)^{-1}\log\mathbb{P}(T_{S2}\geq nx_n^2)\to 0 \quad\text{as } n\to \infty. \] The proof of Theorem 1 is based on a version of the lower bound inequality given by \textit{A. A. Mogulskij} [Sib. Math. J. 37, No. 4, 782-787 (1996); translation from Sib. Mat. Zh. 37, No. 4, 889-894 (1996; Zbl 0878.60022)].
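    To make the selection rule concrete, the following is a minimal computational sketch (not taken from the paper) of \(S2\) and \(T_{S2}\) for the cosine basis, assuming the standard orthonormal form \(\Phi_j(x)=\sqrt{2}\cos(j\pi x)\), \(j\geq 1\); the function name and the particular values of \(n\) and \(m_n\) in the example are illustrative choices, not prescribed by the paper.

import numpy as np

def neyman_T_S2(u, m_n):
    """Sketch of the data-driven Neyman statistic T_{S2} (cosine basis).

    u   : sample of n observations on [0, 1] (uniform under the null)
    m_n : upper bound of the Schwarz-type selection rule (illustrative choice)
    """
    u = np.asarray(u, dtype=float)
    n = u.size
    j = np.arange(1, m_n + 1)

    # phi_j(U_i) for the (assumed) orthonormal cosine basis, shape (m_n, n)
    phi = np.sqrt(2.0) * np.cos(np.pi * np.outer(j, u))

    # |Phi-bar|_k^2 = n^{-2} * sum_{j<=k} (sum_i phi_j(U_i))^2, for k = 1, ..., m_n
    phibar_sq = np.cumsum(phi.sum(axis=1) ** 2) / n**2

    # S2 = smallest k <= m_n maximizing |Phi-bar|_k^2 - k * log(n) / n
    crit = phibar_sq - j * (np.log(n) / n)
    S2 = int(np.argmax(crit)) + 1            # argmax returns the first maximizer

    T = n * phibar_sq[S2 - 1]                # T_{S2} = n * |Phi-bar|_{S2}^2
    return T, S2

# Example (hypothetical values): under the null, T_{S2} is typically small
rng = np.random.default_rng(0)
T, S2 = neyman_T_S2(rng.uniform(size=200), m_n=10)
print(f"S2 = {S2}, T_S2 = {T:.3f}")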
    moderate deviation theorem
    data-driven Neyman statistic
    exponential inequality