A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems

From MaRDI portal

DOI: 10.1137/16M1097572
zbMath: 1392.65062
arXiv: 1607.05428
MaRDI QID: Q4606653

Kim-Chuan Toh, Defeng Sun, Xudong Li

Publication date: 9 March 2018

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1607.05428
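
For context, the Lasso problem treated in the paper is the standard \(\ell_1\)-regularized least-squares problem; the notation below is the usual generic formulation, not copied from the paper itself:

\[
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1,
\qquad A \in \mathbb{R}^{m \times n},\ b \in \mathbb{R}^m,\ \lambda > 0.
\]

As the title indicates, the method combines an augmented Lagrangian framework with a semismooth Newton solver for the resulting subproblems; see the linked arXiv version for the precise formulation and convergence analysis.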



Related Items

A Trust-region Method for Nonsmooth Nonconvex Optimization
An efficient semismooth Newton method for adaptive sparse signal recovery problems
An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
Sparse Approximations with Interior Point Methods
Composite Difference-Max Programs for Modern Statistical Estimation Problems
An investigation on semismooth Newton based augmented Lagrangian method for image restoration
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
Efficient projection onto the intersection of a half-space and a box-like set and its generalized Jacobian
On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
Calibrated zero-norm regularized LS estimator for high-dimensional error-in-variables regression
An Iterative Reduction FISTA Algorithm for Large-Scale LASSO
Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems
A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
A semismooth Newton method for support vector classification and regression
An inexact interior-point Lagrangian decomposition algorithm with inexact oracles
tSSNALM: a fast two-stage semi-smooth Newton augmented Lagrangian method for sparse CCA
Transformed primal-dual methods for nonlinear saddle point systems
An efficient semi-proximal ADMM algorithm for low-rank and sparse regularized matrix minimization problems with real-world applications
Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials
A dual active set method for \(\ell_1\)-regularized problem
A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds
A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Unnamed Item
Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
On proximal augmented Lagrangian based decomposition methods for dual block-angular convex composite programming problems
Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
Local convergence analysis of augmented Lagrangian method for nonlinear semidefinite programming
An efficient augmented Lagrangian method with semismooth Newton solver for total generalized variation
An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
An efficient augmented Lagrangian method for support vector machine
Convergence of the augmented decomposition algorithm
The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion
Iteratively Reweighted FGMRES and FLSQR for Sparse Reconstruction
Double fused Lasso penalized LAD for matrix regression
A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems
An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
On the efficient computation of a generalized Jacobian of the projector over the Birkhoff polytope
A unified primal dual active set algorithm for nonconvex sparse recovery
A linearly convergent majorized ADMM with indefinite proximal terms for convex composite programming and its applications
Proximal Gradient Method for Nonsmooth Optimization over the Stiefel Manifold
An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
An Efficient Proximal Block Coordinate Homotopy Method for Large-Scale Sparse Least Squares Problems
Spectral Operators of Matrices: Semismoothness and Characterizations of the Generalized Jacobian
On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
Smoothing Newton method for \(\ell^0\)-\(\ell^2\) regularized linear inverse problem
An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
Unnamed Item
Unified convergence analysis of a second-order method of multipliers for nonlinear conic programming
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
High-performance statistical computing in the computing environments of the 2020s
A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems
An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising
Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector
An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint


Uses Software


Cites Work