Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods

From MaRDI portal

Publication: 2576738

DOI: 10.1007/s10107-005-0631-2
zbMath: 1124.90021
OpenAlex: W2139313686
Wikidata: Q115385323
Scholia: Q115385323
MaRDI QID: Q2576738

Jérôme Malick, Scott A. Miller

Publication date: 14 December 2005

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://hal.inria.fr/inria-00071403/file/MM05.pdf




Related Items (23)

Nonsmoothness in machine learning: specific structure, proximal identification, and applications
A proximal method for composite minimization
Equilibrium problems on Riemannian manifolds with applications
Sensitivity Analysis for Mirror-Stratifiable Convex Functions
Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
Kantorovich's theorem on Newton's method under majorant condition in Riemannian manifolds
The degrees of freedom of partly smooth regularizers
On partial smoothness, tilt stability and the \({\mathcal {VU}}\)-decomposition
Newton acceleration on manifolds identified by proximal gradient methods
Harnessing Structure in Composite Nonsmooth Minimization
The \(\mathcal{U}\)-Lagrangian, fast track, and partial smoothness of a prox-regular function
Convergence of inexact steepest descent algorithm for multiobjective optimizations on Riemannian manifolds without curvature constraints
Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
First Order Methods for Optimization on Riemannian Manifolds
Dini derivative and a characterization for Lipschitz and convex functions on Riemannian manifolds
A fast gradient and function sampling method for finite-max functions
Subgradient projection algorithms for convex feasibility on Riemannian manifolds with lower bounded curvatures
Local linear convergence analysis of Primal–Dual splitting methods
Convergence rates of forward-Douglas-Rachford splitting method
Active-Set Newton Methods and Partial Smoothness
Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
A \(\mathcal{VU}\)-algorithm for convex minimization




