Pruning variable selection ensembles
DOI: 10.1002/SAM.11410
OpenAlex: W2964276935
Wikidata: Q128275233 (Scholia: Q128275233)
MaRDI QID: Q4970243
FDO: Q4970243
Authors: Chunxia Zhang, Yilei Wu, Mu Zhu
Publication date: 14 October 2020
Published in: Statistical Analysis and Data Mining: The ASA Data Science Journal
Full work available at URL: https://arxiv.org/abs/1704.08265
Cites Work
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Heuristics of instability and stabilization in model selection
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Stability Selection
- Title not available
- Random forests
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Random lasso
- Bayesian variable selection with shrinking and diffusing priors
- Title not available
- Variable Selection with Error Control: Another Look at Stability Selection
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Combining pattern classifiers. Methods and algorithms
- Ensembling neural networks: Many could be better than all
- Boosting. Foundations and algorithms.
- Ensemble approaches for regression: a survey
- On stability issues in deriving multivariable regression models
- Two tales of variable selection for high dimensional regression: Screening and model building
- Variable selection by ensembles for the Cox model
- Extensions of stability selection using subsamples of observations and covariates
- Stabilizing the Lasso against cross-validation variability
- Accurate ensemble pruning with PL-bagging
- Subsampling versus bootstrapping in resampling-based model selection for multivariable regression
- Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates
- Stable prediction in high-dimensional linear models
- Toward an objective and reproducible model choice via variable selection deviation
Cited In (3)
Uses Software
This page was built for publication: Pruning variable selection ensembles