A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
Publication: 4717559
DOI: 10.1137/S1052623494278839
zbMath: 0868.90109
MaRDI QID: Q4717559
Publication date: 1 December 1996
Published in: SIAM Journal on Optimization
Keywords: semismoothness; Moreau-Yosida regularization; nondifferentiable convex minimization; \(Q\)-superlinear rate of convergence
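The keywords refer to the Moreau-Yosida regularization, which smooths a nonsmooth convex function \(f\) by taking \(e_\lambda f(x) = \min_y f(y) + \frac{1}{2\lambda}\|x-y\|^2\); the minimizer is the proximal point of \(x\). The sketch below is only an illustration of this construction for the simple case \(f(x)=|x|\) (where the proximal map is soft-thresholding and the envelope is the Huber function), not an implementation of the paper's algorithm:

```python
def prox_abs(x: float, lam: float) -> float:
    # Proximal operator of f(y) = |y|: argmin_y |y| + (x - y)^2 / (2*lam).
    # For the absolute value this is soft-thresholding.
    sign = 1.0 if x > 0 else -1.0
    return sign * max(abs(x) - lam, 0.0)

def moreau_envelope_abs(x: float, lam: float) -> float:
    # Moreau-Yosida regularization e_lam f(x) = min_y |y| + (x - y)^2 / (2*lam).
    # Evaluating the objective at the proximal point gives the envelope value;
    # for f = |.| this equals the Huber function:
    #   x^2 / (2*lam)   if |x| <= lam,
    #   |x| - lam / 2   otherwise.
    y = prox_abs(x, lam)
    return abs(y) + (x - y) ** 2 / (2.0 * lam)

# The envelope is differentiable everywhere with gradient (x - prox)/lam,
# even though |x| itself is nonsmooth at 0 -- this smoothness is what
# proximal Newton-type methods exploit.
```

For example, with \(\lambda = 1\), `moreau_envelope_abs(3.0, 1.0)` returns `2.5` (the linear Huber branch \(|x|-\lambda/2\)), while `moreau_envelope_abs(0.5, 1.0)` returns `0.125` (the quadratic branch \(x^2/(2\lambda)\)).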
Related Items
Convergence analysis of some methods for minimizing a nonsmooth convex function
Variable metric bundle methods: From conceptual to implementable forms
A preconditioning proximal Newton method for nondifferentiable convex optimization