Optimal convergence rates for Nesterov acceleration
DOI: 10.1137/18M1186757 · zbMATH Open: 1453.90117 · arXiv: 1805.05719 · OpenAlex: W2994888550 · Wikidata: Q126559838 · MaRDI QID: Q5206941
Authors: Jean-François Aujol, Charles Dossal, Aude Rondepierre
Publication date: 19 December 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1805.05719
Recommendations
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Convergence rates of the heavy ball method for quasi-strongly convex optimization
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
Keywords: optimization; rate of convergence; Lyapunov functions; ordinary differential equations; Łojasiewicz property
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Asymptotic properties of solutions to ordinary differential equations (34D05)
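For orientation, the scheme named in the title is Nesterov-style inertial gradient descent. The sketch below is a minimal illustration, not code from the paper: the function name `nesterov_agd`, the inertial coefficient \(k/(k+\alpha)\) (a common variant of the classical \((k-1)/(k+\alpha-1)\)), and the toy quadratic are all illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, step, alpha=3.0, iters=500):
    """Minimal sketch of Nesterov-style inertial gradient descent.

    Iterates (a common variant of the scheme studied in this literature):
        y_k     = x_k + (k / (k + alpha)) * (x_k - x_{k-1})
        x_{k+1} = y_k - step * grad(y_k)
    For an L-smooth convex objective with step <= 1/L and alpha >= 3,
    this achieves the classical O(1/k^2) rate on the objective gap.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        # Extrapolation (momentum) step, then a gradient step at y.
        y = x + (k / (k + alpha)) * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

# Toy usage: f(x) = 0.5 * x^T A x with A positive definite, so L = 4 here.
A = np.diag([4.0, 1.0])
x_min = nesterov_agd(lambda x: A @ x, np.array([3.0, -2.0]), step=1 / 4.0)
```

The parameter `alpha` plays the role of the damping parameter \(\alpha\) appearing in several of the cited works, e.g. the subcritical case \(\alpha \leq 3\).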
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Title not available
- On semi- and subanalytic geometry
- Title not available
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- The heavy ball with friction method, I. The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- On the long time behavior of second order differential equations with asymptotically small dissipation
- Title not available
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Error bounds and Hölder metric subregularity
- From error bounds to the complexity of first-order descent methods for convex functions
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Asymptotics for a second-order differential equation with nonautonomous damping and an integrable source term
- Asymptotic for the perturbed heavy ball system with vanishing damping term
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Asymptotics for some vibro-impact problems with a linear dissipation term
- On damped second-order gradient systems
- Stability of Over-Relaxations for the Forward-Backward Algorithm, Application to FISTA
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity
- The Differential Inclusion Modeling FISTA Algorithm and Optimality of Convergence Rate in the Case \(b \leq 3\)
- Asymptotic for a second-order evolution equation with convex potential and vanishing damping term
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
Cited In (32)
- Convergence Rates of the Heavy Ball Method for Quasi-strongly Convex Optimization
- Weak and linear convergence of a generalized proximal point algorithm with alternating inertial steps for a monotone inclusion problem
- From the Ravine Method to the Nesterov Method and Vice Versa: A Dynamical System Perspective
- Optimal convergence rate of inertial gradient system with flat geometries and perturbations
- Convergence rates of damped inertial dynamics from multi-degree-of-freedom system
- Inertial Newton algorithms avoiding strict saddle points
- Proximal Gradient Methods for Machine Learning and Imaging
- Stochastic optimization with momentum: convergence, fluctuations, and traps avoidance
- Convergence Rates of Damped Inertial Dynamics under Geometric Conditions and Perturbations
- Convergence Rates of Inertial Primal-Dual Dynamical Methods for Separable Convex Optimization Problems
- Projection methods with alternating inertial steps for variational inequalities: weak and linear convergence
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Convergence rate of a relaxed inertial proximal algorithm for convex minimization
- An extension of the second order dynamical system that models Nesterov's convex gradient method
- A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
- Convergence rates of the heavy-ball method under the Łojasiewicz property
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- The Nesterov accelerated gradient algorithm for auto-regressive exogenous models with random lost measurements: interpolation method and auxiliary model method
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- Stochastic differential equations for modeling first order optimization methods
- Double inertial parameters forward-backward splitting method: Applications to compressed sensing, image processing, and SCAD penalty problems
- Optimal decay rates for semi-linear non-autonomous evolution equations with vanishing damping
- Optimal convergence rates for damped inertial gradient dynamics with flat geometries
- Inertial projection and contraction algorithms with larger step sizes for solving quasimonotone variational inequalities
- Generalized proximal point algorithms with correction terms and extrapolation
- Inertial primal-dual dynamics with damping and scaling for linearly constrained convex optimization problems
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- Convergence results of two-step inertial proximal point algorithm
- Convergence rates of first- and higher-order dynamics for solving linear ill-posed problems
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- FISTA is an automatic geometrically optimized algorithm for strongly convex functions