Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
From MaRDI portal
Publication: 5214191
zbMath: 1434.68430 · arXiv: 1803.10740 · MaRDI QID: Q5214191
Kim-Chuan Toh, Ziyan Luo, Defeng Sun, Nai-Hua Xiu
Publication date: 7 February 2020
Full work available at URL: https://arxiv.org/abs/1803.10740
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Learning and adaptive systems in artificial intelligence (68T05)
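For context on the paper's subject: SLOPE penalizes a coefficient vector with a nonincreasing weight sequence applied to the sorted absolute values, and OSCAR is the special case with linearly spaced weights. A basic building block for any SLOPE solver is the proximal operator of the sorted-ℓ1 norm, computable by a stack-based pool-adjacent-violators (PAVA) pass. The sketch below is illustrative background only, assuming the standard PAVA-based prox (it is not the semismooth Newton augmented Lagrangian method of the cited paper, and `prox_slope` is a hypothetical name):

```python
# Minimal sketch (an assumption, not code from the cited paper): the proximal
# operator of the sorted-l1 (SLOPE) norm J(x) = sum_i lam[i] * |x|_(i),
# where lam is nonincreasing and nonnegative, via stack-based PAVA.

def prox_slope(y, lam):
    """Return argmin_x 0.5*||x - y||^2 + sum_i lam[i] * |x|_(i)."""
    n = len(y)
    # Sort |y| in decreasing order, remembering original positions.
    order = sorted(range(n), key=lambda i: -abs(y[i]))
    d = [abs(y[i]) - lam[k] for k, i in enumerate(order)]
    # Nonincreasing isotonic regression of d via PAVA (stack of blocks).
    blocks = []  # each block: [sum, count]; fitted value is sum / count
    for v in d:
        blocks.append([v, 1])
        # Merge while adjacent block averages violate the decreasing order.
        while (len(blocks) > 1
               and blocks[-1][0] * blocks[-2][1] >= blocks[-2][0] * blocks[-1][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand blocks, clip at zero, restore signs and original positions.
    x = [0.0] * n
    k = 0
    for s, c in blocks:
        val = max(s / c, 0.0)
        for _ in range(c):
            i = order[k]
            x[i] = val if y[i] >= 0 else -val
            k += 1
    return x
```

For example, `prox_slope([3.0, 1.0], [1.0, 0.5])` shrinks each sorted magnitude by its own weight, while tied or order-violating entries are averaged together by the PAVA merge step.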
Related Items
- Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
- A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
- A semismooth Newton method for support vector classification and regression
- An efficient augmented Lagrangian method for support vector machine
- An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
- Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
- A Lagrange-Newton algorithm for sparse nonlinear programming
- B-Subdifferentials of the Projection onto the Generalized Simplex
- Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming
- SLOPE-adaptive variable selection via convex optimization
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Newton and quasi-Newton methods for normal maps with polyhedral sets
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
- A nonsmooth version of Newton's method
- Pathwise coordinate optimization
- Sparse Modeling for Image and Vision Processing
- Hankel Matrix Rank Minimization with Applications to System Identification and Realization
- Constrained Statistical Inference
- On the Moreau--Yosida Regularization of the Vector $k$-Norm Related Functions
- Just relax: convex programming methods for identifying sparse signals in noise
- Some continuity properties of polyhedral multifunctions
- A Biometrics Invited Paper. The Analysis and Selection of Variables in Linear Regression
- Monotone Operators and the Proximal Point Algorithm
- Semismooth and Semiconvex Functions in Constrained Optimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Atomic Decomposition by Basis Pursuit
- Variational Analysis
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Gap Safe screening rules for sparsity enforcing penalties
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- Proximité et dualité dans un espace hilbertien
- Convex Analysis
- The Isotonic Regression Problem and Its Dual
- Semismooth Matrix-Valued Functions
- Set-valued analysis