Cited in (50)
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- Combining a relaxed EM algorithm with Occam's razor for Bayesian variable selection in high-dimensional regression
- Influence measures and stability for graphical models
- Variable selection for sparse Dirichlet-multinomial regression with an application to microbiome data analysis
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Stability Selection
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Stabilizing the Lasso against cross-validation variability
- From simple structure to sparse components: a review
- Inferring large graphs using \(\ell_1\)-penalized likelihood
- adass
- Estimation and variable selection with exponential weights
- Prediction error bounds for linear regression with the TREX
- Aggregated hold out for sparse linear regression with a robust loss function
- Stability
- Nearly unbiased variable selection under minimax concave penalty
- Maximin effects in inhomogeneous large-scale data
- Stable feature selection for biomarker discovery
- Stability of feature selection in classification issues for high-dimensional correlated data
- LassoNet
- AdaptFitOS
- ScreenClean
- FAMT
- S+WAVELETS
- WWGbook
- CAPUSHE
- SparseFIS
- spinyReg
- PBoostGA
- BALD
- mothur
- GADAG
- scalreg
- EEBoost
- relaxo
- AMPR
- RandGA
- dSTEM
- palasso
- Bootstrap inference for network construction with an application to a breast cancer microarray study
- Additive model selection
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- TIGER: A tuning-insensitive approach for optimally estimating Gaussian graphical models
- Sparse principal component analysis via fractional function regularity
- Semi-analytic resampling in Lasso
- Randomized maximum-contrast selection: subagging for large-scale regression
- Entropy-randomized projection
- A novel bagging approach for variable ranking and selection via a mixed importance measure
- Self-concordant analysis for logistic regression
- Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
This page was built for software: Bolasso