Model selection for high-dimensional linear regression with dependent observations (Q2215720)
From MaRDI portal
scientific article
Statements
Model selection for high-dimensional linear regression with dependent observations (English)
0 references
14 December 2020
The article focuses on high-dimensional regression models of the form \(y_t=\sum_{j=1}^{p} \beta_j x_{tj}+\epsilon_t\), \(t=1,\dots,n\), where \(n\) is the sample size, \(x_{t1},\dots,x_{tp}\) are predictor variables and \(\epsilon_t\) are disturbance terms. The aim is to develop an efficient model selection procedure. The author assumes that \((x_t,\epsilon_t)\) is a zero-mean stationary time series with \(E(x_t\epsilon_t)=0\), and the model selection procedure considered pairs the orthogonal greedy algorithm (OGA) with the high-dimensional Akaike information criterion (HDAIC). Under a set of rather specific assumptions on the coefficients and parameters, together with sparsity conditions, rates of convergence are obtained both for the OGA alone and for OGA+HDAIC, and the results are compared with existing ones in the literature. Additional technical results are given in Appendices A and B. Simulation results illustrating the performance of OGA+HDAIC, together with further technical details, can be found in the supplementary material: Supplement to ``Model selection for high-dimensional linear regression with dependent observations'', \url{doi:10.1214/19-AOS1872SUPP}.
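The greedy step of the OGA can be sketched as follows. This is an illustrative implementation, not the paper's code: it assumes the columns of the design matrix are (roughly) standardized, so that inner products with the current residual play the role of correlations, and it omits the HDAIC stage that selects the number of greedy iterations along the OGA path.

```python
import numpy as np

def oga(X, y, m):
    """Orthogonal greedy algorithm (sketch): at each step, pick the
    predictor most correlated with the current residual, then refit y
    by least squares on all predictors selected so far (the orthogonal
    projection step)."""
    n, p = X.shape
    selected = []
    residual = y.copy()
    for _ in range(m):
        # Inner products of each column with the residual; exclude
        # already-selected predictors from the argmax.
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf
        selected.append(int(np.argmax(scores)))
        # Orthogonal projection of y onto the span of selected columns.
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ beta
    return selected, residual
```

In a full OGA+HDAIC procedure, one would run the loop up to some maximal number of steps and then choose the iteration minimizing an information criterion whose penalty grows with \(\log p\), as analyzed in the paper.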
best \(m\)-term approximations
high-dimensional Akaike's information criterion
orthogonal greedy algorithm
sparsity conditions
time series