Robust sparse regression and tuning parameter selection via the efficient bootstrap information criteria
DOI: 10.1080/00949655.2012.755532 · zbMath: 1453.62583 · OpenAlex: W1985992733 · MaRDI QID: Q5220012
Sadanori Konishi, Heewon Park, Fumitake Sakaori
Publication date: 9 March 2020
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://doi.org/10.1080/00949655.2012.755532
Keywords: regression model; elastic net; least-trimmed squares; efficient bootstrap information criteria; Lasso-type regularization
MSC: Ridge regression; shrinkage estimators (Lasso) (62J07); Robustness and adaptive procedures (parametric inference) (62F35); Bootstrap, jackknife and other resampling methods (62F40)
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- High breakdown estimators for principal components: the projection-pursuit approach revisited
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Information criteria and statistical modeling.
- Generalised information criteria in model selection
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Robust Nonparametric Regression via Sparsity Control With Application to Load Curve Data Cleansing
- Stability Selection
- Regularization and Variable Selection Via the Elastic Net
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context