On the sensitivity of the Lasso to the number of predictor variables
DOI: 10.1214/16-STS586
zbMath: 1442.62160
arXiv: 1403.4544
MaRDI QID: Q1790389
Cheryl J. Flynn, Clifford M. Hurvich, Jeffrey S. Simonoff
Publication date: 2 October 2018
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/1403.4544
Cites Work
- Best subset selection via a modern optimization lens
- A lasso for hierarchical interactions
- Statistical significance in high-dimensional linear models
- A new perspective on least squares under convex constraint
- Statistics for high-dimensional data. Methods, theory and applications.
- Near-ideal model selection by \(\ell_1\) minimization
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Relaxed Lasso
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Estimating the dimension of a model
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Least angle regression. (With discussion)
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Aggregation for Gaussian regression
- On the "degrees of freedom" of the lasso
- Leave-one-out cross-validation is risk consistent for Lasso
- Efficiency for Regularization Parameter Selection in Penalized Likelihood Estimation of Misspecified Models
- Regression and time series model selection in small samples
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A Survey of L1 Regression
- A Model-Averaging Approach for High-Dimensional Regression
- VIF Regression: A Fast Regression Algorithm for Large Data
- Aggregation and Sparsity Via ℓ1 Penalized Least Squares