Leave-one-out cross-validation is risk consistent for Lasso
From MaRDI portal
DOI: 10.1007/s10994-014-5438-z
zbMath: 1320.62172
arXiv: 1206.6128
OpenAlex: W2088526229
MaRDI QID: Q2512895
Darren Homrighausen, Daniel J. McDonald
Publication date: 2 February 2015
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1206.6128
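The publication studies the risk consistency of leave-one-out cross-validation for the Lasso. A minimal sketch of that procedure (illustrative only, not code from the paper; the simulated data, `scikit-learn` estimator, and candidate penalty grid are my own assumptions) is:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative sketch: the leave-one-out CV risk estimate whose
# consistency the paper analyzes. For each observation i, fit the
# Lasso on the remaining n-1 points and score the squared error on i.
rng = np.random.default_rng(0)
n, p = 60, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]           # sparse true coefficients (assumed setup)
y = X @ beta + rng.normal(scale=0.5, size=n)

def loo_cv_risk(X, y, lam):
    """Leave-one-out estimate of the prediction risk of Lasso(alpha=lam)."""
    n = len(y)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i       # hold out observation i
        model = Lasso(alpha=lam).fit(X[mask], y[mask])
        errs[i] = (y[i] - model.predict(X[i:i + 1])[0]) ** 2
    return errs.mean()

# Pick the penalty minimizing the LOO risk estimate over a small grid.
lams = [0.01, 0.05, 0.1, 0.5]
risks = {lam: loo_cv_risk(X, y, lam) for lam in lams}
best = min(risks, key=risks.get)
```

Risk consistency here means that, as n grows, the Lasso fit tuned by this LOO criterion attains prediction risk converging to that of the oracle choice of penalty.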
Related Items (6)
- On cross-validated Lasso in high dimensions
- The leave-worst-\(k\)-out criterion for cross validation
- Cross validation in LASSO and its acceleration
- Consistent parameter estimation for Lasso and approximate message passing
- On the sensitivity of the Lasso to the number of predictor variables
- Semi-analytic approximate stability selection for correlated data in generalized linear models
Uses Software
Cites Work
- Strong consistency of Lasso estimators
- Degrees of freedom in lasso problems
- LASSO-pattern search algorithm with application to ophthalmology and genomic data
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- A distribution-free theory of nonparametric regression
- Asymptotics for Lasso-type estimators.
- Least angle regression. (With discussion)
- The Lasso problem and uniqueness
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- On the ``degrees of freedom'' of the lasso
- High-dimensional graphs and variable selection with the Lasso
- Unified LASSO Estimation by Least Squares Approximation
- Uniform Convergence in Probability and Stochastic Equicontinuity
- Atomic Decomposition by Basis Pursuit
- Stochastic Limit Theory
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- 10.1162/153244302760200704
- Linear Model Selection by Cross-Validation
- The Lasso, correlated design, and improved oracle inequalities