Regularization and variable selection for infinite variance autoregressive models
From MaRDI portal
Publication:447619
Recommendations
- Efficient estimation and variable selection for infinite variance autoregressive models
- Model selection for infinite variance time series
- Self-Weighted Least Absolute Deviation Estimation for Infinite Variance Autoregressive Models
- Simultaneous sparse model selection and coefficient estimation for heavy-tailed autoregressive processes
- Least tail-trimmed absolute deviation estimation for autoregressions with infinite/finite variance
Cites work
- scientific article; zbMATH DE number 4102349 (no title available)
- scientific article; zbMATH DE number 193577 (no title available)
- scientific article; zbMATH DE number 1219611 (no title available)
- scientific article; zbMATH DE number 3444596 (no title available)
- scientific article; zbMATH DE number 4186934 (no title available)
- "Infinite Variance" and Research Strategy in Time Series Analysis
- A simple general approach to inference about the tail of a distribution
- Adaptive Lasso for sparse high-dimensional regression models
- Asymptotics for Lasso-type estimators
- COBS: qualitatively constrained smoothing via linear programming
- Consistency of Akaike's information criterion for infinite variance autoregressive processes
- Dealing with the multiplicity of solutions of the \(\ell _{1}\) and \(\ell _{\infty }\) regression models
- Estimating the dimension of a model
- Heavy tail modeling and teletraffic data. (With discussions and rejoinder)
- M-estimation for autoregression with infinite variance
- On the asymptotics of constrained \(M\)-estimation
- One-step sparse estimates in nonconcave penalized likelihood models
- Quantile smoothing splines
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Regularity and minimality of infinite variance processes
- Self-Weighted Least Absolute Deviation Estimation for Infinite Variance Autoregressive Models
- The Adaptive Lasso and Its Oracle Properties
- Wold decomposition, prediction and parameterization of stationary processes with infinite variance
Cited in (6 documents)
- Simultaneous sparse model selection and coefficient estimation for heavy-tailed autoregressive processes
- Model selection for infinite variance time series
- Efficient maximum approximated likelihood inference for Tukey's \(g\)-and-\(h\) distribution
- Efficient estimation and variable selection for infinite variance autoregressive models
- Adaptive Lasso for linear regression models with ARMA-GARCH errors
- Exponential squared loss based robust variable selection of AR models