Improved variable selection with forward-lasso adaptive shrinkage
Publication: 542500
DOI: 10.1214/10-AOAS375
zbMath: 1220.62089
arXiv: 1104.3390
OpenAlex: W1972414162
MaRDI QID: Q542500
Authors: Gareth M. James, Peter Radchenko
Publication date: 10 June 2011
Published in: The Annals of Applied Statistics
Full work available at URL: https://arxiv.org/abs/1104.3390
Mathematics Subject Classification
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Generalized linear models (logistic models) (62J12)
Related Items
- Interpretable dimension reduction for classifying functional data
- Variable selection for survival data with a class of adaptive elastic net techniques
- Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data
- Joint estimation and variable selection for mean and dispersion in proper dispersion models
- Principal minimax support vector machine for sufficient dimension reduction with contaminated data
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
- In defense of LASSO
- Variable selection in functional additive regression models
- Quantile forward regression for high-dimensional survival data
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Long-term time series prediction using OP-ELM
- Dealing with big data: comparing dimension reduction and shrinkage regression methods
- Robust regression: an inferential method for determining which independent variables are most important
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- FIRST: combining forward iterative selection and shrinkage in high dimensional sparse linear regression
- Relaxed Lasso
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Variable Inclusion and Shrinkage Algorithms
- A generalized Dantzig selector with shrinkage tuning
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A Statistical View of Some Chemometrics Regression Tools
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net