Tuning parameter selection for the adaptive LASSO in the autoregressive model
From MaRDI portal
Publication: Q526980
DOI: 10.1016/j.jkss.2016.10.005
zbMath: 1485.62123
OpenAlex: W2554472417
MaRDI QID: Q526980
Okyoung Na, Sangin Lee, Sunghoon Kwon
Publication date: 15 May 2017
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1016/j.jkss.2016.10.005
Keywords: subset selection; Bayesian information criterion; autoregressive model; least absolute shrinkage and selection operator; penalized estimation
MSC classification:
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
Ridge regression; shrinkage estimators (Lasso) (62J07)
Related Items
Bayesian empirical likelihood inference and order shrinkage for autoregressive models ⋮ Penalized multiply robust estimation in high-order autoregressive processes with missing explanatory variables
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Estimation in high-dimensional linear models with deterministic design matrices
- Autoregressive process modeling via the Lasso procedure
- Order selection in nonstationary autoregressive models
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- The estimation of the order of an ARMA process
- Estimating the dimension of a model
- Bootstraps for time series
- Nonconcave penalized likelihood with a diverging number of parameters.
- On Bernstein-type inequalities for martingales.
- Calibrating nonconvex penalized regression in ultra-high dimension
- Pathwise coordinate optimization
- Fitting autoregressive models for prediction
- Large sample properties of the smoothly clipped absolute deviation penalized maximum likelihood estimation on high dimensions
- Global optimality of nonconvex penalized estimators
- Shrinkage Tuning Parameter Selection with a Diverging number of Parameters
- PREDICTION‐FOCUSED MODEL SELECTION FOR AUTOREGRESSIVE MODELS
- Selection of the order of an autoregressive model by Akaike's information criterion
- A cross-validatory method for dependent data
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- On a method of identification of best subset model from full AR-model
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Estimation of stationary autoregressive models with the Bayesian LASSO
- Smoothly Clipped Absolute Deviation on High Dimensions
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- A general theory of concave regularization for high-dimensional sparse estimation problems