Variable screening for high dimensional time series
From MaRDI portal
Abstract: Variable selection is a widely studied problem in high dimensional statistics, primarily because estimating the precise relationship between the covariates and the response is of great importance in many scientific disciplines. However, most of the theory and methods developed toward this goal for the linear model invoke the assumption of iid sub-Gaussian covariates and errors. This paper analyzes the theoretical properties of Sure Independence Screening (SIS) (Fan and Lv [J. R. Stat. Soc. Ser. B Stat. Methodol. 70 (2008) 849-911]) for high dimensional linear models with dependent and/or heavy-tailed covariates and errors. We also introduce a generalized least squares screening (GLSS) procedure which utilizes the serial correlation present in the data. By exploiting this serial correlation when estimating our marginal effects, GLSS is shown to outperform SIS in many cases. For both procedures we prove sure screening properties, which depend on the moment conditions and the strength of dependence in the error and covariate processes, amongst other factors. Additionally, combining these screening procedures with the adaptive Lasso is analyzed. Dependence is quantified by functional dependence measures (Wu [Proc. Natl. Acad. Sci. USA 102 (2005) 14150-14154]), and the results rely on the use of Nagaev-type and exponential inequalities for dependent random variables. We also conduct simulations to demonstrate the finite sample performance of these procedures, and include a real-data application: forecasting the US inflation rate.
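The SIS step described in the abstract ranks covariates by their marginal association with the response and retains the top few; GLSS replaces these ordinary-least-squares marginal estimates with generalized least squares ones that account for serial correlation in the errors. A minimal sketch of the SIS ranking step (a generic illustration, not the authors' implementation; the function name `sis_screen` and the toy data are assumptions):

```python
import numpy as np

def sis_screen(X, y, d):
    """Rank covariates by absolute marginal correlation with the
    response and keep the top d (Sure Independence Screening).
    A generic sketch, not the paper's exact procedure."""
    # Standardize columns so marginal OLS slopes reduce to correlations.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    yc = y - y.mean()
    # Componentwise (marginal) regression coefficients.
    omega = np.abs(Xc.T @ yc) / len(y)
    # Indices of the d largest marginal effects.
    return np.argsort(omega)[::-1][:d]

# Toy example: y depends only on the first two of 50 covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=200)
keep = sis_screen(X, y, d=5)
```

Under serial dependence, GLSS would estimate each marginal effect by GLS with an estimated error autocovariance matrix rather than the plain inner product above; the sure screening guarantees in the paper then hinge on the moment and dependence conditions quantified via functional dependence measures.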
Recommendations
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Sure independence screening in generalized linear models with NP-dimensionality
- Greedy forward regression for variable screening
- Some notes on robust sure independence screening
- Penalized linear regression with high-dimensional pairwise screening
Cites work
- scientific article; zbMATH DE number 1735137
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 2199188
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Adaptive Lasso for sparse high-dimensional regression models
- An overview of recent developments in genomics and associated statistical methods
- Asymptotic spectral theory for nonlinear time series
- Asymptotic theory for stationary processes
- Banding sample autocovariance matrices of stationary processes
- Covariance and precision matrix estimation for high-dimensional time series
- Covariance matrix estimation for stationary time series
- Feature screening via distance correlation learning
- Generalized Least Squares with an Estimated Autocovariance Matrix
- Generalized least squares with misspecified serial correlation structures
- Independent Screening for Single-Index Hazard Rate Models with Ultrahigh Dimensional Features
- Long Range Dependence
- Marginal empirical likelihood and sure independence feature screening
- Martingale difference correlation and its use in high-dimensional variable screening
- Mixing: Properties and examples
- Model-free feature screening for ultrahigh-dimensional data
- Model-free sure screening via maximum correlation
- Moving-average representation of autoregressive approximations
- Non-strong mixing autoregressive processes
- Nonlinear system theory: Another look at dependence
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Varying Coefficient Models
- Nonparametric independence screening and structure identification for ultra-high dimensional longitudinal data
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- On linear processes with dependent innovations
- Oracle inequalities for high dimensional vector autoregressions
- Performance bounds for parameter estimates of high-dimensional linear models with correlated errors
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Regularized estimation in sparse high-dimensional time series models
- Robust rank correlation based screening
- Shrinkage tuning parameter selection with a diverging number of parameters
- Simultaneous analysis of Lasso and Dantzig selector
- Statistical challenges of high-dimensional data
- Statistics for high-dimensional data. Methods, theory and applications.
- Stochastic Limit Theory
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Sure independence screening in generalized linear models with NP-dimensionality
- The Adaptive Lasso and Its Oracle Properties
- Time series: theory and methods.
- Tuning parameter selection in high dimensional penalized likelihood
- Ultrahigh dimensional time course feature selection
- Variable selection for sparse high-dimensional nonlinear regression models by combining nonnegative garrote and sure independence screening
- \(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors
Cited in (5)
- Covariate assisted screening and estimation
- Determining the number of factors for high-dimensional time series
- Boosting high dimensional predictive regressions with time varying parameters
- Sequential monitoring of high-dimensional time series
- Targeting Predictors Via Partial Distance Correlation With Applications to Financial Forecasting
This page was built for publication: Variable screening for high dimensional time series
MaRDI item Q1746535