Hybrid safe-strong rules for efficient optimization in Lasso-type problems
DOI: 10.1016/J.CSDA.2020.107063
OpenAlex: W3065811904
MaRDI QID: Q830584
FDO: Q830584
Authors: Yaohui Zeng, Tianbao Yang, Patrick Breheny
Publication date: 7 May 2021
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://arxiv.org/abs/1704.08742
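For background, the screening idea this publication extends can be sketched via the sequential strong rule of Tibshirani et al. (2012), listed under Cites Work below. The following is a minimal illustrative sketch, assuming standardized predictors and the objective (1/2)||y − Xβ||² + λ||β||₁; the function and variable names are hypothetical, and this is not the hybrid safe-strong rule proposed in the paper itself.

```python
import numpy as np

def strong_rule_candidates(X, y, beta_prev, lam_prev, lam_curr):
    """Sequential strong rule screen for the Lasso (Tibshirani et al., 2012).

    Illustrative sketch only: assumes standardized columns of X and the
    objective (1/2)*||y - X @ beta||^2 + lam * ||beta||_1.  Given the
    solution beta_prev at the previous penalty lam_prev, returns the
    indices of predictors that survive screening at lam_curr.
    """
    resid = y - X @ beta_prev      # residual at the previous lambda
    corr = np.abs(X.T @ resid)     # |x_j' r(lam_prev)| for each predictor j
    # Strong rule: discard j when |x_j' r(lam_prev)| < 2*lam_curr - lam_prev.
    keep = corr >= 2.0 * lam_curr - lam_prev
    return np.where(keep)[0]
```

Strong rules can, in rare cases, wrongly discard predictors, so a KKT check on the excluded set is still required after fitting; safe rules never discard an active predictor but typically screen out fewer variables, and the hybrid rules of this publication combine the two approaches.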
Recommendations
- Lasso screening rules via dual polytope projection
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- A safe reinforced feature screening strategy for Lasso based on feasible solutions
- Gap safe screening rules for sparsity enforcing penalties
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
Cites Work
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- The Group Lasso for Logistic Regression
- The Lasso problem and uniqueness
- Efficient block-coordinate descent algorithms for the group Lasso
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Lasso screening rules via dual polytope projection
Cited In (4)