Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
From MaRDI portal
Publication:5097019
Recommendations
- Analysis of asymptotic escape of strict saddle sets in manifold optimization
- A Newton-based method for nonconvex optimization with fast evasion of saddle points
- Convergence guarantees for a class of non-convex and non-smooth optimization problems
- Proximal methods avoid active strict saddles of weakly convex functions
- Behavior of accelerated gradient methods near critical points of nonconvex functions
Cites work
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- A geometric analysis of phase retrieval
- A log-barrier Newton-CG method for bound constrained optimization with complexity guarantees
- A model algorithm for composite nondifferentiable optimization problems
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Accelerated methods for nonconvex optimization
- Adaptive regularization with cubics on manifolds
- Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
- Convex analysis and nonlinear optimization. Theory and examples
- Cubic regularization of Newton method and its global performance
- Finding approximate local minima faster than gradient descent
- Finding second-order stationary points in constrained minimization: a feasible direction approach
- First-order methods almost always avoid strict saddle points
- Future Challenges for Variational Analysis
- Generic minimizing behavior in semialgebraic optimization
- Inequalities for gamma function ratios
- Introductory lectures on convex optimization. A basic course.
- Limiting subhessians, limiting subjets and their calculus
- Nonconvergence to unstable points in urn models and stochastic approximations
- On Nonconvex Optimization for Machine Learning
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- Proximal methods avoid active strict saddles of weakly convex functions
- Stochastic model-based minimization of weakly convex functions
- The Global Optimization Geometry of Low-Rank Matrix Optimization
- Trust-region Newton-CG with strong second-order complexity guarantees for nonconvex optimization
- Variational Analysis
Cited in (4)
- Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
- Analysis of asymptotic escape of strict saddle sets in manifold optimization
- Global solutions to nonconvex problems by evolution of Hamilton-Jacobi PDEs
- Stochastic subgradient descent escapes active strict saddles on weakly convex functions