Simple expressions of the Lasso and SLOPE estimators in low-dimension
From MaRDI portal
DOI: 10.1080/02331888.2020.1720019
zbMATH Open: 1435.62275
OpenAlex: W2924077917
MaRDI QID: Q5222210
Authors: Patrick J. C. Tardivel, R. Servien, D. Concordet
Publication date: 1 April 2020
Published in: Statistics
Full work available at URL: https://doi.org/10.1080/02331888.2020.1720019
Recommendations
- Slope meets Lasso: improved oracle bounds and optimality
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Does SLOPE outperform bridge regression?
- SLOPE-adaptive variable selection via convex optimization
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Statistics for high-dimensional data. Methods, theory and applications.
- High-dimensional graphs and variable selection with the Lasso
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- A significance test for the lasso
- Controlling the false discovery rate via knockoffs
- Sequential selection procedures and false discovery rate control
- SLOPE-adaptive variable selection via convex optimization
- Familywise error rate control via knockoffs
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- On Sparse Vector Recovery Performance in Structurally Orthogonal Matrices via LASSO
- Selective inference with unknown variance via the square-root Lasso
- On Lasso refitting strategies
- On the sign recovery by least absolute shrinkage and selection operator, thresholded least absolute shrinkage and selection operator, and thresholded basis pursuit denoising
Cited In (5)
- Slope meets Lasso: improved oracle bounds and optimality
- Pattern recovery and signal denoising by SLOPE when the design matrix is orthogonal
- Proximal operator for the sorted \(\ell_1\) norm: application to testing procedures based on SLOPE
- Does SLOPE outperform bridge regression?
- SLASSO: a scaled LASSO for multicollinear situations