Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
Publication: 2576738
DOI: 10.1007/s10107-005-0631-2
zbMath: 1124.90021
OpenAlex: W2139313686
Wikidata: Q115385323 (Scholia: Q115385323)
MaRDI QID: Q2576738
Jérôme Malick, Scott A. Miller
Publication date: 14 December 2005
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://hal.inria.fr/inria-00071403/file/MM05.pdf
Related Items (23)
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- A proximal method for composite minimization
- Equilibrium problems on Riemannian manifolds with applications
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Kantorovich's theorem on Newton's method under majorant condition in Riemannian manifolds
- The degrees of freedom of partly smooth regularizers
- On partial smoothness, tilt stability and the \(\mathcal{VU}\)-decomposition
- Newton acceleration on manifolds identified by proximal gradient methods
- Harnessing Structure in Composite Nonsmooth Minimization
- The \(\mathcal{U}\)-Lagrangian, fast track, and partial smoothness of a prox-regular function
- Convergence of inexact steepest descent algorithm for multiobjective optimizations on Riemannian manifolds without curvature constraints
- Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
- Activity Identification and Local Linear Convergence of Forward-Backward-type Methods
- First Order Methods for Optimization on Riemannian Manifolds
- Dini derivative and a characterization for Lipschitz and convex functions on Riemannian manifolds
- A fast gradient and function sampling method for finite-max functions
- Subgradient projection algorithms for convex feasibility on Riemannian manifolds with lower bounded curvatures
- Local linear convergence analysis of Primal-Dual splitting methods
- Convergence rates of forward-Douglas-Rachford splitting method
- Active-Set Newton Methods and Partial Smoothness
- Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
- A \(\mathcal{VU}\)-algorithm for convex minimization
Cites Work
- Minimizing a differentiable function over a differential manifold
- A \(\mathcal{VU}\)-algorithm for convex minimization
- On \(\mathcal{VU}\)-theory for Functions with Primal-Dual Gradient Structure
- The watchdog technique for forcing convergence in algorithms for constrained optimization
- Large-Scale Optimization of Eigenvalues
- Quasi-Newton Methods, Motivation and Theory
- The Geometry of Algorithms with Orthogonality Constraints
- Optimization Techniques on Riemannian Manifolds
- Primal-Dual Gradient Structured Functions: Second-Order Results; Links to Epi-Derivatives and Partly Smooth Functions
- Newton's method on Riemannian manifolds: covariant alpha theory
- \(\mathcal{VU}\)-smoothness and proximal point results for some nonconvex functions
- The \(\mathcal{U}\)-Lagrangian of a convex function
- The \(\mathcal{U}\)-Lagrangian of the Maximum Eigenvalue Function
- Active Sets, Nonsmoothness, and Sensitivity
- On Eigenvalue Optimization