Lasso, iterative feature selection and the correlation selector: oracle inequalities and numerical performances
From MaRDI portal
Publication: 1951793
DOI: 10.1214/08-EJS288 · zbMath: 1320.62084 · arXiv: 0710.4466 · MaRDI QID: Q1951793
Publication date: 24 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/0710.4466
Keywords: confidence regions; regression estimation; Lasso; statistical learning; shrinkage and thresholding methods
Mathematics Subject Classification:
62G08: Nonparametric regression and quantile regression
62J07: Ridge regression; shrinkage estimators (Lasso)
62G15: Nonparametric tolerance and confidence regions
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Transductive versions of the Lasso and the Dantzig selector
- Generalization of constraints for high dimensional regression problems
Uses Software
Cites Work
- Iterative feature selection in least square regression estimation
- Some theoretical results on the grouped variables Lasso
- Symmetrization approach to concentration inequalities for empirical processes
- Least angle regression. (With discussion)
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder)
- Approximation and learning by greedy algorithms
- A Statistical View of Some Chemometrics Regression Tools
- Density estimation with quadratic loss: a confidence intervals method
- Sparse Density Estimation with \(\ell_1\) Penalties
- Model Selection and Estimation in Regression with Grouped Variables