Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
From MaRDI portal
Publication: 2220664
DOI: 10.1007/s10107-019-01432-w
zbMath: 1459.65083
arXiv: 1610.03446
OpenAlex: W2977046017
Wikidata: Q127217092
Scholia: Q127217092
MaRDI QID: Q2220664
Dmitriy Drusvyatskiy, Alexander D. Ioffe, Adrian S. Lewis
Publication date: 25 January 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1610.03446
Keywords: nonsmooth optimization; error estimation; Ekeland's principle; Kurdyka-Łojasiewicz inequality; Taylor-like model
MSC classifications: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10)
Related Items
- A Unified Analysis of Descent Sequences in Weakly Convex Optimization, Including Convergence Rates for Bundle Methods
- Global convergence of model function based Bregman proximal minimization algorithms
- Kurdyka-Łojasiewicz exponent via inf-projection
- The equivalence of three types of error bounds for weakly and approximately convex functions
- Harnessing Structure in Composite Nonsmooth Minimization
- Theoretical aspects in penalty hyperparameters optimization
- Non-smooth non-convex Bregman minimization: unification and new algorithms
- Optimal Convergence Rates for the Proximal Bundle Method
- Calculus rules of the generalized concave Kurdyka-Łojasiewicz property
- Continuous Newton-like Methods Featuring Inertia and Variable Mass
- Stochastic Model-Based Minimization of Weakly Convex Functions
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Composite Optimization by Nonconvex Majorization-Minimization
- Modern regularization methods for inverse problems
- Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
- Variable Metric Forward-Backward Algorithm for Composite Minimization Problems
- Efficiency of minimizing compositions of convex functions and smooth maps
- Convergence Rate Analysis of a Sequential Convex Programming Method with Line Search for a Class of Constrained Difference-of-Convex Optimization Problems
- Choose Your Path Wisely: Gradient Descent in a Bregman Distance Framework
- Inexact model: a framework for optimization and variational inequalities
- Hölderian Error Bounds and Kurdyka-Łojasiewicz Inequality for the Trust Region Subproblem
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Gradient methods for minimizing composite functions
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Quadratic growth and critical point stability of semi-algebraic functions
- Transversality and alternating projections for nonconvex sets
- Accelerating the cubic regularization of Newton's method on convex problems
- On gradients of functions definable in o-minimal structures
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Nonsmooth equations in optimization. Regularity, calculus, methods and applications
- From error bounds to the complexity of first-order descent methods for convex functions
- On the variational principle
- A Gauss-Newton method for convex composite optimization
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Efficiency of minimizing compositions of convex functions and smooth maps
- Cubic regularization of Newton method and its global performance
- Curves of Descent
- On the Evaluation Complexity of Composite Function Minimization with Applications to Nonconvex Nonlinear Programming
- Convergence of an Inexact Algorithm for Composite Nonsmooth Optimization
- Clarke Subgradients of Stratifiable Functions
- Implicit Functions and Solution Mappings
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- On the global convergence of trust region algorithms for unconstrained minimization
- On the superlinear convergence of a trust region algorithm for nonsmooth optimization
- Descent methods for composite nondifferentiable optimization problems
- A model algorithm for composite nondifferentiable optimization problems
- Proximal Subgradients, Marginal Values, and Augmented Lagrangians in Nonconvex Optimization
- Monotone Operators and the Proximal Point Algorithm
- Optimization of Lipschitz continuous functions
- Upper-Lipschitz multifunctions and inverse subdifferentials
- Metric regularity and subdifferential calculus
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Prox-regular functions in variational analysis
- Variational Analysis of Regular Mappings
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Composite Optimization by Nonconvex Majorization-Minimization
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- Modified Gauss–Newton scheme with worst case guarantees for global performance