Behavior of accelerated gradient methods near critical points of nonconvex functions
DOI: 10.1007/s10107-018-1340-y
zbMath: 1415.90092
arXiv: 1706.07993
OpenAlex: W2964265968
Wikidata: Q129037120 (Scholia: Q129037120)
MaRDI QID: Q2425178
Publication date: 26 June 2019
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1706.07993
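For context on the methods named in the title, the following is a minimal illustrative sketch of the heavy-ball (Polyak) and Nesterov accelerated gradient iterations on a simple nonconvex test function. The test function, step size, and momentum parameter are hypothetical choices for demonstration and are not taken from the paper.

```python
# Illustrative sketch only: generic heavy-ball and Nesterov accelerated
# gradient iterations near a strict saddle point of a nonconvex function.
# All parameter values below are hypothetical, not from the paper.
import numpy as np

def grad(x):
    # Gradient of f(x, y) = x^2 - y^2, whose critical point (0, 0)
    # is a strict saddle (a standard nonconvex test case).
    return np.array([2.0 * x[0], -2.0 * x[1]])

def heavy_ball(x0, alpha=0.05, beta=0.9, iters=200):
    # Polyak heavy-ball: x_{k+1} = x_k - alpha*grad(x_k) + beta*(x_k - x_{k-1})
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

def nesterov(x0, alpha=0.05, beta=0.9, iters=200):
    # Nesterov acceleration: the gradient is evaluated at the
    # extrapolated point y_k = x_k + beta*(x_k - x_{k-1}).
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x_next = y - alpha * grad(y)
        x_prev, x = x, x_next
    return x

if __name__ == "__main__":
    x0 = np.array([1.0, 1e-3])  # small perturbation off the saddle's stable manifold
    print("heavy-ball:", heavy_ball(x0))
    print("nesterov  :", nesterov(x0))
```

Started from a point slightly off the saddle's stable manifold, both momentum iterations drive the unstable coordinate away from the critical point, which is the kind of near-saddle behavior the paper analyzes.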
Related Items
- Global Convergence of Stochastic Gradient Hamiltonian Monte Carlo for Nonconvex Stochastic Optimization: Nonasymptotic Performance Bounds and Momentum-Based Acceleration
- Approximating the nearest stable discrete-time system
- Convergence of the Momentum Method for Semialgebraic Functions with Locally Lipschitz Gradients
- Inertial Newton algorithms avoiding strict saddle points
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Second-Order Guarantees of Distributed Gradient Algorithms
- Finding the Nearest Positive-Real System
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
- Generalized Momentum-Based Methods: A Hamiltonian Perspective
Cites Work
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Introductory lectures on convex optimization. A basic course.
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- Convergence Rates of Inertial Forward-Backward Algorithms
- Heavy-ball method in nonconvex optimization problems