An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
Publication: 2288192
DOI: 10.1007/S10107-018-1329-6
zbMath: 1435.90112
arXiv: 1712.05910
OpenAlex: W2963105041
MaRDI QID: Q2288192
Yangjing Zhang, Ning Zhang, Kim-Chuan Toh, Defeng Sun
Publication date: 17 January 2020
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1712.05910
Mathematics Subject Classification:
- Linear regression; mixed models (62J05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
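For context, the sparse group Lasso model named in the title is commonly formulated as the regularized least-squares problem below. This is a standard-notation sketch (symbols \(A\), \(b\), \(\lambda_1\), \(\lambda_2\), groups \(G_l\), weights \(w_l\) are generic conventions, not taken from this page), so details may differ from the paper's exact setup:

\[
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\|Ax - b\|_2^2 \;+\; \lambda_1 \|x\|_1 \;+\; \lambda_2 \sum_{l=1}^{g} w_l \|x_{G_l}\|_2,
\]

where \(A \in \mathbb{R}^{m \times n}\) and \(b \in \mathbb{R}^m\) are the data, \(\lambda_1, \lambda_2 \ge 0\) trade off element-wise and group-wise sparsity, and \(x_{G_l}\) denotes the subvector of \(x\) indexed by the \(l\)-th group \(G_l\).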
Related Items (24)
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
- Unnamed Item
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- An investigation on semismooth Newton based augmented Lagrangian method for image restoration
- Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems
- A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
- Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models
- Linearly-convergent FISTA variant for composite optimization with duality
- A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
- Unnamed Item
- Newton-type methods with the proximal gradient step for sparse estimation
- A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems
- An efficient augmented Lagrangian method with semismooth Newton solver for total generalized variation
- An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming
- The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion
- Iteratively Reweighted Group Lasso Based on Log-Composite Regularization
- A semismooth Newton stochastic proximal point algorithm with variance reduction
- A Corrected Inexact Proximal Augmented Lagrangian Method with a Relative Error Criterion for a Class of Group-Quadratic Regularized Optimal Transport Problems
- A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems
- An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
- Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
- An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
- A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems
- Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming
- Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer
- An extension of Luque's growth condition
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Complementarity functions and numerical experiments on some smoothing Newton methods for second-order-cone complementarity problems
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
- A nonsmooth version of Newton's method
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
- Hankel Matrix Rank Minimization with Applications to System Identification and Realization
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Monotone Operators and the Proximal Point Algorithm
- Semismooth and Semiconvex Functions in Constrained Optimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Gap Safe screening rules for sparsity enforcing penalties
- Sparsity and Smoothness Via the Fused Lasso
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Robust Recovery of Signals From a Structured Union of Subspaces
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Model Selection and Estimation in Regression with Grouped Variables
- Semismooth Matrix-Valued Functions