A Unified Analysis of Descent Sequences in Weakly Convex Optimization, Including Convergence Rates for Bundle Methods
DOI: 10.1137/21M1465445
MaRDI QID: Q5883316
Paulo J. S. Silva, Felipe Atenas, Mikhail V. Solodov, Claudia A. Sagastizábal
Publication date: 30 March 2023
Published in: SIAM Journal on Optimization
Keywords: error bound; linear convergence; bundle methods; descent methods; proximal gradient method; model-based methods; weak convexity; proximal descent
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming) (90C33); Methods of successive quadratic programming type (90C55)
Cites Work
- A proximal method for composite minimization
- Composite proximal bundle method
- Computing proximal points of nonconvex functions
- On gradients of functions definable in o-minimal structures
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Error bounds in mathematical programming
- Convergence analysis of perturbed feasible descent methods
- Local behavior of an iterative framework for generalized equations with nonisolated solutions
- Convergence rate analysis of iterative algorithms for solving variational inequality problems
- Efficiency of proximal bundle methods
- A unified approach to error bounds for structured convex optimization problems
- Linear convergence of epsilon-subgradient descent methods for a class of convex functions
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Convergence of non-smooth descent methods using the Kurdyka-Łojasiewicz inequality
- Methods of descent for nondifferentiable optimization
- Efficiency of minimizing compositions of convex functions and smooth maps
- Rate of convergence of the bundle method
- The equivalence of three types of error bounds for weakly and approximately convex functions
- Metric regularity: a survey. Part 1. Theory
- A Redistributed Proximal Bundle Method for Nonconvex Optimization
- Clarke Subgradients of Stratifiable Functions
- A modification and an extension of Lemarechal’s algorithm for nonsmooth minimization
- Semismooth and Semiconvex Functions in Constrained Optimization
- Upper-Lipschitz multifunctions and inverse subdifferentials
- First-Order Methods in Optimization
- Minimizing Nonconvex Nonsmooth Functions via Cutting Planes and Proximity Control
- Modified Projection-Type Methods for Monotone Variational Inequalities
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
- Newton-Type Methods for Optimization and Variational Problems
- On a Class of Nonsmooth Composite Functions
- Numerical optimization. Theoretical and practical aspects. Transl. from the French
- Globally convergent variable metric method for nonconvex nondifferentiable unconstrained minimization
- A proximal bundle method for nonsmooth nonconvex functions with inexact information