Autoregressive process modeling via the Lasso procedure
Abstract: The Lasso is a popular model selection and estimation procedure for linear models that enjoys attractive theoretical properties. In this paper, we study the Lasso estimator for fitting autoregressive time series models. We adopt a double asymptotic framework in which the maximal lag may increase with the sample size. We derive theoretical results establishing various types of consistency. In particular, we give conditions under which the Lasso estimator of the autoregressive coefficients is model selection consistent, estimation consistent, and prediction consistent. Simulation study results are reported.
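As a rough illustration of the setting described in the abstract (not the authors' own code or simulation design), one can fit a sparse autoregressive model by running the Lasso on a lagged design matrix; the simulated process, lag choices, and penalty level below are all hypothetical:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulate a sparse AR process: X_t = 0.5 X_{t-1} + 0.3 X_{t-4} + eps_t
n, p = 500, 10  # sample size and maximal lag considered
eps = rng.normal(size=n + p)
x = np.zeros(n + p)
for t in range(4, n + p):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 4] + eps[t]

# Lagged design matrix: row t holds (X_{t-1}, ..., X_{t-p})
X = np.column_stack([x[p - k:-k] for k in range(1, p + 1)])
y = x[p:]

# The l1 penalty shrinks coefficients of irrelevant lags to zero,
# performing order/variable selection and estimation simultaneously
fit = Lasso(alpha=0.05, fit_intercept=False).fit(X, y)
print(np.round(fit.coef_, 2))
```

With a suitably tuned penalty, the nonzero estimates concentrate on the active lags (here lags 1 and 4), which is the model selection consistency the paper studies as both the sample size and the maximal lag grow.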
Cites work
- scientific article; zbMATH DE number 5957408 (no title available)
- scientific article; zbMATH DE number 4102349 (no title available)
- scientific article; zbMATH DE number 3723610 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- A general class of exponential inequalities for martingales and ratios
- Aggregation for Gaussian regression
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Basic properties of strong mixing conditions. A survey and some open questions
- Data-Driven Efficient Estimation of the Spectral Density
- Greed is Good: Algorithmic Results for Sparse Approximation
- Just relax: convex programming methods for identifying sparse signals in noise
- Lasso-type recovery of sparse representations for high-dimensional data
- Least angle regression. (With discussion)
- Nonasymptotic bounds for autoregressive time series modeling.
- Nonconcave penalized likelihood with a diverging number of parameters.
- Order selection for same-realization predictions in autoregressive processes
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Sparse additive models
- Sparsity oracle inequalities for the Lasso
- The Adaptive Lasso and Its Oracle Properties
- The log-linear group-lasso estimator and its asymptotic properties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (53)
- Order selection for possibly infinite-order non-stationary time series
- Modified LASSO estimators for time series regression models with dependent disturbances
- Bayesian empirical likelihood inference and order shrinkage for autoregressive models
- Lasso with long memory regression errors
- Semiparametric estimation of INAR models using roughness penalization
- Model selection for time series with nonlinear trend
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Regularized bridge-type estimation with multiple penalties
- LASSO order selection for sparse autoregression: a bootstrap approach
- Uncertain Autoregressive Model via LASSO Procedure
- Matrix autoregressive models: generalization and Bayesian estimation
- On stochastic dynamic modeling of incidence data
- A note on the asymptotic distribution of lasso estimator for correlated data
- The LASSO method for bilinear time series models
- Simultaneous sparse model selection and coefficient estimation for heavy-tailed autoregressive processes
- CGMM LASSO-type estimator for the process of Ornstein-Uhlenbeck type
- Lasso estimation for spherical autoregressive processes
- Consistent and conservative model selection with the adaptive Lasso in stationary and nonstationary autoregressions
- Regularization for stationary multivariate time series
- Asymptotics of the adaptive elastic net estimation for conditional heteroscedastic time series models
- Lasso based variable selection of ARMA models
- Iteratively reweighted adaptive Lasso for conditional heteroscedastic time series with applications to AR-ARCH type processes
- Banded regularization of autocovariance matrices in application to parameter estimation and forecasting of time series
- Regularization in dynamic random‐intercepts models for analysis of longitudinal data
- Lasso regression in sparse linear model with \(\varphi\)-mixing errors
- Time-varying Lasso
- Adaptive LASSO-type estimation for multivariate diffusion processes
- Integer-valued time series model order shrinkage and selection via penalized quasi-likelihood approach
- Order shrinkage and selection for the INGARCH(p,q) model
- Variable selection in quantile regression when the models have autoregressive errors
- Exponential squared loss based robust variable selection of AR models
- Variable selection for first‐order Poisson integer‐valued autoregressive model with covariables
- Lasso Inference for High-Dimensional Time Series
- Autoregressive models for matrix-valued time series
- Simultaneous statistical inference for second order parameters of time series under weak conditions
- Oracle inequalities for high dimensional vector autoregressions
- Lassoing the HAR model: a model selection perspective on realized volatility dynamics
- Lasso-type penalties for covariate selection and forecasting in time series
- Lasso-driven inference in time and space
- Generalized information criterion for the AR model
- A Bernstein-type inequality for high dimensional linear processes with applications to robust estimation of time series regressions
- \(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors
- Performance bounds for parameter estimates of high-dimensional linear models with correlated errors
- Penalised inference for lagged dependent regression in the presence of autocorrelated residuals
- Subset selection for vector autoregressive processes using Lasso
- Oracle model selection for correlated data via residuals
- On the adaptive Lasso estimator of AR(\(p\)) time series with applications to INAR(\(p\)) and Hawkes processes
- The Doubly Adaptive LASSO for Vector Autoregressive Models
- High-dimensional regression with potential prior information on variable importance
- Adaptive LASSO estimation for ARDL models with GARCH innovations
- Poisson autoregressive process modeling via the penalized conditional maximum likelihood procedure
- Tuning parameter selection for the adaptive LASSO in the autoregressive model
- Estimation of stationary autoregressive models with the Bayesian LASSO