Improved variable selection with forward-lasso adaptive shrinkage
DOI: 10.1214/10-AOAS375
zbMATH Open: 1220.62089
arXiv: 1104.3390
OpenAlex: W1972414162
MaRDI QID: Q542500
FDO: Q542500
Authors: Peter Radchenko, Gareth M. James
Publication date: 10 June 2011
Published in: The Annals of Applied Statistics
Abstract: Recently, considerable interest has focused on variable selection methods in regression situations where the number of predictors, \(p\), is large relative to the number of observations, \(n\). Two commonly applied variable selection approaches are the Lasso, which computes highly shrunk regression coefficients, and Forward Selection, which uses no shrinkage. We propose a new approach, "Forward-Lasso Adaptive SHrinkage" (FLASH), which includes the Lasso and Forward Selection as special cases, and can be used in both the linear regression and the Generalized Linear Model domains. As with the Lasso and Forward Selection, FLASH iteratively adds one variable to the model in a hierarchical fashion but, unlike these methods, at each step adjusts the level of shrinkage so as to optimize the selection of the next variable. We first present FLASH in the linear regression setting and show that it can be fitted using a variant of the computationally efficient LARS algorithm. Then, we extend FLASH to the GLM domain and demonstrate, through numerous simulations and real-world data sets, as well as some theoretical analysis, that FLASH generally outperforms many competing approaches.
Full work available at URL: https://arxiv.org/abs/1104.3390
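The abstract describes FLASH as interpolating between Forward Selection (full least-squares steps, no shrinkage) and the Lasso (heavily shrunk steps) as each variable enters the model. The sketch below illustrates that interpolation idea in Python; it is not the authors' LARS-based algorithm, which adaptively tunes the shrinkage at every step. The single global `delta`, the function name `flash_sketch`, and the simple correlation-based entry rule are assumptions made purely for illustration.

```python
import numpy as np

def flash_sketch(X, y, delta=0.5, n_steps=5):
    """Toy stepwise fit: delta = 1 mimics Forward Selection's full
    least-squares step; delta near 0 leaves each step heavily shrunk,
    as at the Lasso end of the FLASH family. Illustrative only."""
    n, p = X.shape
    beta = np.zeros(p)
    active = []
    for _ in range(min(n_steps, p)):
        r = y - X @ beta                        # current residual
        j = int(np.argmax(np.abs(X.T @ r)))     # most correlated predictor
        if j not in active:
            active.append(j)
        A = X[:, active]
        # full least-squares fit on the active set: the Forward Selection target
        beta_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
        # move the active coefficients a delta-fraction of the way toward it
        beta[active] = beta[active] + delta * (beta_ls - beta[active])
    return beta, active

# usage on synthetic data with three true signals
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = X[:, :3] @ np.array([3.0, -2.0, 1.5]) + rng.standard_normal(100)
beta_hat, selected = flash_sketch(X, y, delta=0.7, n_steps=4)
print(sorted(selected), np.round(beta_hat[beta_hat != 0], 2))
```

Setting `delta=1` makes each step a pure forward-stepwise refit, while small values leave coefficients shrunk between steps; the paper's actual method instead chooses the shrinkage level at each step to improve selection of the next variable.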
MSC classification:
- Linear regression; mixed models (62J05)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Title not available
- A generalized Dantzig selector with shrinkage tuning
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- Adaptive Lasso for sparse high-dimensional regression models
- Relaxed Lasso
- A Statistical View of Some Chemometrics Regression Tools
- FIRST: combining forward iterative selection and shrinkage in high dimensional sparse linear regression
- Variable inclusion and shrinkage algorithms
Cited In (27)
- Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and autometrics
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
- The predictive Lasso
- Fast FSR Variable Selection with Applications to Clinical Trials
- Interpretable dimension reduction for classifying functional data
- Dealing with big data: comparing dimension reduction and shrinkage regression methods
- Variable selection in functional additive regression models
- Forward-backward selection with early dropping
- Principal minimax support vector machine for sufficient dimension reduction with contaminated data
- Variable selection for survival data with a class of adaptive elastic net techniques
- Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data
- FIRST: combining forward iterative selection and shrinkage in high dimensional sparse linear regression
- Joint estimation and variable selection for mean and dispersion in proper dispersion models
- Variable inclusion and shrinkage algorithms
- Robust regression: an inferential method for determining which independent variables are most important
- Fast forward selection for generalized estimating equations with a large number of predictor variables
- Variable Selection and Shrinkage: Comparison of Some Approaches
- Decomposition feature selection with applications in detecting correlated biomarkers of bipolar disorders
- Long-term time series prediction using OP-ELM
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Quantile forward regression for high-dimensional survival data
- Lookahead and piloting strategies for variable selection
- Regularization for electricity price forecasting
- Sequential Lasso cum EBIC for feature selection with ultra-high dimensional feature space
- Variable selection via RIVAL (removing irrelevant variables amidst lasso iterations) and its application to nuclear material detection
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- In defense of LASSO