Penalised inference for lagged dependent regression in the presence of autocorrelated residuals
From MaRDI portal
Abstract: Linear models that contain a time-dependent response and explanatory variables have attracted much interest in recent years. The most general form of the existing approaches is a linear regression model with autoregressive moving average residuals. The addition of the moving average component results in a complex model with a very challenging implementation. In this paper, we propose to account for the time dependency in the data by explicitly adding autoregressive terms of the response variable in the linear model. In addition, we consider an autoregressive process for the errors in order to capture complex dynamic relationships parsimoniously. To broaden the application of the model, we present a penalized likelihood approach for the estimation of the parameters and show how adaptive lasso penalties lead to an estimator that enjoys the oracle property. Furthermore, we prove the consistency of the estimators with respect to the mean squared prediction error in high-dimensional settings, an aspect that has not been considered by the existing time-dependent regression models. A simulation study and a real data analysis demonstrate the successful application of the model to financial data on stock indexes.
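The model class described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the response is regressed on its own lag and on covariates, the errors follow an AR(1) process, and sparse coefficients are selected with a two-stage adaptive lasso (OLS pilot estimates supply the adaptive weights). The simulation parameters and the use of scikit-learn's `Lasso` via column rescaling are illustrative assumptions.

```python
# Hypothetical sketch of a lagged dependent regression with AR(1) errors,
# estimated by an adaptive lasso (two-stage: OLS pilot, then weighted lasso).
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 300, 5
X = rng.normal(size=(n, p))
beta = np.array([1.5, 0.0, -2.0, 0.0, 0.0])  # sparse true coefficients (assumed)
phi = 0.5                                    # coefficient on the lagged response

# Simulate y_t = phi * y_{t-1} + x_t' beta + e_t, with AR(1) errors e_t.
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.3 * e[t - 1] + rng.normal()
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + X[t] @ beta + e[t]

# Design matrix: lagged response plus contemporaneous covariates.
Z = np.column_stack([y[:-1], X[1:]])
target = y[1:]

# Stage 1: OLS pilot estimates give adaptive weights w_j = 1 / |b_j^pilot|.
pilot = LinearRegression().fit(Z, target).coef_
w = 1.0 / np.abs(pilot)

# Stage 2: lasso on weight-rescaled columns, then map back to the original scale.
fit = Lasso(alpha=0.05).fit(Z / w, target)
coef = fit.coef_ / w
print(np.round(coef, 2))  # [phi_hat, beta_hat_1, ..., beta_hat_5]
```

The adaptive rescaling penalizes coefficients with small pilot estimates more heavily, which is the mechanism behind the oracle property the abstract refers to; the paper's actual estimator additionally models the autoregressive error process rather than leaving it in the residuals as this sketch does.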
Recommendations
- Penalized regression models with autoregressive error terms
- Nonconcave penalized estimation in sparse vector autoregression model
- SCAD-penalized regression for varying-coefficient models with autoregressive errors
- Penalized generalized empirical likelihood in high-dimensional weakly dependent data
- Significant variable selection and autoregressive order determination for time-series partially linear models
Cites work
- Scientific article (untitled), zbMATH DE number 1573291
- Scientific article (untitled), zbMATH DE number 3997615
- Scientific article (untitled), zbMATH DE number 845714
- A direct estimation of high dimensional stationary vector autoregressions
- Adaptive Lasso for linear regression models with ARMA-GARCH errors
- Adaptive Lasso for sparse high-dimensional regression models
- Akaike's information criterion in generalized estimating equations
- Asymptotics for Lasso-type estimators
- Autoregressive process modeling via the Lasso procedure
- Cube root asymptotics
- Distribution of Residual Autocorrelations in Autoregressive-Integrated Moving Average Time Series Models
- Extended Bayesian information criteria for model selection with large model spaces
- High-dimensional graphs and variable selection with the Lasso
- Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Regularized estimation in sparse high-dimensional time series models
- Shrinkage estimation for linear regression with ARMA errors
- Testing the null hypothesis of stationarity against the alternative of a unit root. How sure are we that economic time series have a unit root?
- The Adaptive Lasso and Its Oracle Properties
- The Bayesian Lasso
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (3)
This page was built for publication: Penalised inference for lagged dependent regression in the presence of autocorrelated residuals