Two tales of variable selection for high dimensional regression: Screening and model building
Publication: 4969932
DOI: 10.1002/sam.11219
OpenAlex: W1483870728
MaRDI QID: Q4969932
Yoonkyung Lee, Cong Liu, Tao Shi
Publication date: 14 October 2020
Published in: Statistical Analysis and Data Mining: The ASA Data Science Journal
Full work available at URL: https://doi.org/10.1002/sam.11219
Related Items
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- Pruning variable selection ensembles
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Lasso-type recovery of sparse representations for high-dimensional data
- Relaxed Lasso
- Nonconcave penalized likelihood with a diverging number of parameters
- On the adaptive elastic net with a diverging number of parameters
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- High-dimensional graphs and variable selection with the Lasso
- Forward Regression for Ultra-High Dimensional Variable Screening
- Better Subset Regression Using the Nonnegative Garrote
- Extended Bayesian information criteria for model selection with large model spaces
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Regularization and Variable Selection Via the Elastic Net
- On the Non-Negative Garrotte Estimator