Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
DOI: 10.1137/21M1430868 · MaRDI QID: Q5097019
Authors: Damek Davis, Mateo Díaz, Dmitriy Drusvyatskiy
Publication date: 19 August 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2106.09815
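For context on the title's central object: the Moreau envelope of a function \(f\) with parameter \(\lambda > 0\) is the standard smoothing used throughout the cited literature. The following is the textbook definition (see, e.g., the cited Variational Analysis), not text drawn from the publication itself:
\[
f_\lambda(x) \;=\; \min_{y}\,\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\operatorname{prox}_{\lambda f}(x) \;=\; \operatorname*{argmin}_{y}\,\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\}.
\]
For \(\rho\)-weakly convex \(f\) and \(\lambda < 1/\rho\), the envelope is continuously differentiable with \(\nabla f_\lambda(x) = \lambda^{-1}\big(x - \operatorname{prox}_{\lambda f}(x)\big)\), which is why saddle points of the nonsmooth \(f\) can be studied through the smooth surrogate \(f_\lambda\).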
Recommendations
- Analysis of asymptotic escape of strict saddle sets in manifold optimization
- A Newton-based method for nonconvex optimization with fast evasion of saddle points
- Convergence guarantees for a class of non-convex and non-smooth optimization problems
- Proximal methods avoid active strict saddles of weakly convex functions
- Behavior of accelerated gradient methods near critical points of nonconvex functions
Classification (MSC)
- Numerical mathematical programming methods (65K05)
- Numerical optimization and variational techniques (65K10)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
- Stochastic programming (90C15)
- Nonsmooth analysis (49J52)
Cites Work
- Variational Analysis
- Introductory lectures on convex optimization. A basic course.
- Inequalities for gamma function ratios
- Convex analysis and nonlinear optimization. Theory and examples
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- Limiting subhessians, limiting subjets and their calculus
- Nonconvergence to unstable points in urn models and stochastic approximations
- Cubic regularization of Newton method and its global performance
- A model algorithm for composite nondifferentiable optimization problems
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Generic minimizing behavior in semialgebraic optimization
- Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
- Accelerated methods for nonconvex optimization
- Finding approximate local minima faster than gradient descent
- A geometric analysis of phase retrieval
- Future Challenges for Variational Analysis
- First-order methods almost always avoid strict saddle points
- A log-barrier Newton-CG method for bound constrained optimization with complexity guarantees
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Adaptive regularization with cubics on manifolds
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- The Global Optimization Geometry of Low-Rank Matrix Optimization
- Finding second-order stationary points in constrained minimization: a feasible direction approach
- Proximal methods avoid active strict saddles of weakly convex functions
- Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization
- On Nonconvex Optimization for Machine Learning
Cited In (1)