An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems

From MaRDI portal

Publication: 2288192

DOI: 10.1007/S10107-018-1329-6
zbMath: 1435.90112
arXiv: 1712.05910
OpenAlex: W2963105041
MaRDI QID: Q2288192

Yangjing Zhang, Ning Zhang, Kim-Chuan Toh, Defeng Sun

Publication date: 17 January 2020

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1712.05910
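
For reference, the sparse group Lasso problem named in the title is typically posed as the convex program below. This is the standard formulation only; the notation and weights are illustrative and are not quoted from the paper itself.

% Standard sparse group Lasso formulation (illustrative; notation assumed, not quoted from the paper).
\[
  \min_{x \in \mathbb{R}^n} \; \frac{1}{2}\,\|Ax - b\|_2^2
  \;+\; \lambda_1 \|x\|_1
  \;+\; \lambda_2 \sum_{l=1}^{g} w_l \,\|x_{G_l}\|_2
\]
% Here $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$ are the data, the index sets
% $G_1, \dots, G_g$ partition $\{1, \dots, n\}$ into groups, $x_{G_l}$ is the subvector of $x$
% indexed by $G_l$, $w_l > 0$ are group weights, and $\lambda_1, \lambda_2 \ge 0$ balance
% elementwise and groupwise sparsity.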




Related Items (24)

Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
Unnamed Item
A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
An investigation on semismooth Newton based augmented Lagrangian method for image restoration
Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems
A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models
Linearly-convergent FISTA variant for composite optimization with duality
A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
Unnamed Item
Newton-type methods with the proximal gradient step for sparse estimation
A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems
An efficient augmented Lagrangian method with semismooth Newton solver for total generalized variation
An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming
The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion
Iteratively Reweighted Group Lasso Based on Log-Composite Regularization
A semismooth Newton stochastic proximal point algorithm with variance reduction
A Corrected Inexact Proximal Augmented Lagrangian Method with a Relative Error Criterion for a Class of Group-Quadratic Regularized Optimal Transport Problems
A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems
An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems
Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector


Uses Software



Cites Work




This page was built for publication: An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems