On the use of cubic spline smoothing for testing parametric linear regression models (Q1965955)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | On the use of cubic spline smoothing for testing parametric linear regression models | scientific article |
Statements
On the use of cubic spline smoothing for testing parametric linear regression models (English)
2 March 2000
The fixed design regression model \(y_i=f(x_i)+\varepsilon_i\), \(x_i\in[0,1]\), \(i=1,\dots,n\), is considered, where the \(\varepsilon_i\) are i.i.d. \({\mathcal N}(0,\sigma^2)\) and \(f\) is an unknown function from the Sobolev space \(W_2^{(2)}[0,1]\). Let \(G\) be a finite-dimensional linear subspace of \(W_2^{(2)}[0,1]\). The author considers tests of the null hypothesis \(f\in G\) against the sequence of alternatives \(f=f_0+f_1\), where \(f_0\in G\) and \(f_1=\gamma_n g\) for some \(g\) orthogonal to \(G\).

The cubic smoothing spline fit \(\hat r\) is used in all these tests. It is defined as the function \(r\) minimizing the functional \[ n^{-1}\sum_{i=1}^n(y_i-r(x_i))^2+\lambda\int_0^1 r^{(2)}(x)^2\,dx, \] where \(\lambda\to 0\) as \(n\to\infty\). Let \(S\) be the spline smoother matrix, i.e. the matrix satisfying \((\hat r(x_1),\dots,\hat r(x_n))'=Sy\), \(y=(y_1,\dots,y_n)'\), and let \(P_G\) be the projection matrix from \(R^n\) onto the subspace \(\{(h(x_1),\dots,h(x_n))\colon h\in G\}\).

The considered tests are based on the statistics \(T_k=y'M_k y\), where \(M_1=(I-P_G)'S'S(I-P_G)\) (Eubank and Spiegelman statistic); \(M_2=(I-P_G)'(I-P_G)-(I-S)'(I-S)\) (pseudo likelihood ratio statistic); and \(M_3=(I-P_G)'(I-P_G)-(I-P_G)'(I-S)'(I-S)(I-P_G)\) (Azzalini and Bowman statistic).

The following theorem is proved. If \(n\to\infty\), \(\lambda\to 0\), and \(\gamma_n=(n\lambda^{1/8})^{-1/2}\to 0\), then for \(k=1,2\) \[ (T_k-\sigma^{2}tr(M_k))\left(\sigma^2\sqrt{2tr(M_k^2)}\right)^{-1}\to{\mathcal N}\left(\|g\|^2/\sigma^2 C_k,1\right). \] In the case \(k=3\) some additional conditions are required for this convergence. The constants \(C_k\) are computed. With the help of this theorem the author computes the Pitman asymptotic relative efficiency (ARE) of the tests and finds that the second and third tests are asymptotically equivalent and more efficient than the first test. Some numerical examples with simulated data are presented.
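The three quadratic-form statistics above can be sketched numerically. The following is a minimal illustration, not the author's implementation: it builds the smoother matrix \(S=(I+n\lambda K)^{-1}\) via the standard Green–Silverman band-matrix construction of the cubic smoothing spline at the design points (an assumed computational route; the review does not specify one), and takes \(G\) to be the space of linear functions as an example. The sample size, \(\lambda\), and test function are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, sigma = 100, 1e-4, 0.1
x = np.linspace(0.0, 1.0, n)                 # fixed design on [0, 1]
y = np.sin(2 * np.pi * x) + sigma * rng.standard_normal(n)  # f not in G

# Cubic smoothing spline smoother matrix S = (I + n*lam*K)^{-1},
# K = D' W^{-1} D (Green & Silverman construction); the factor n matches
# the n^{-1} normalisation of the least-squares term in the criterion.
h = np.diff(x)                               # knot spacings
D = np.zeros((n - 2, n))                     # second-difference matrix
W = np.zeros((n - 2, n - 2))                 # tridiagonal Gram matrix
for i in range(n - 2):
    D[i, i] = 1.0 / h[i]
    D[i, i + 1] = -1.0 / h[i] - 1.0 / h[i + 1]
    D[i, i + 2] = 1.0 / h[i + 1]
    W[i, i] = (h[i] + h[i + 1]) / 3.0
    if i + 1 < n - 2:
        W[i, i + 1] = W[i + 1, i] = h[i + 1] / 6.0
K = D.T @ np.linalg.solve(W, D)
S = np.linalg.inv(np.eye(n) + n * lam * K)

# Projection P_G onto the design-point values of G; here G = linear functions.
X = np.column_stack([np.ones(n), x])
P = X @ np.linalg.solve(X.T @ X, X.T)

I = np.eye(n)
R = I - P
M1 = R.T @ S.T @ S @ R                        # Eubank-Spiegelman
M2 = R.T @ R - (I - S).T @ (I - S)            # pseudo likelihood ratio
M3 = R.T @ R - R.T @ (I - S).T @ (I - S) @ R  # Azzalini-Bowman
T1, T2, T3 = (y @ M @ y for M in (M1, M2, M3))
```

Each \(T_k\) would then be standardized as in the theorem, subtracting \(\sigma^2 tr(M_k)\) and dividing by \(\sigma^2\sqrt{2 tr(M_k^2)}\), before comparison with the normal limit.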
spline smoothing
lack of fit tests
asymptotic relative efficiency