Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
DOI: 10.1007/S10107-020-01476-3
zbMATH Open: 1465.90062
OpenAlex: W2914848482
MaRDI QID: Q2020604
FDO: Q2020604
Authors: Vassilis Apidopoulos, Jean-François Aujol, Charles Dossal, Aude Rondepierre
Publication date: 23 April 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-020-01476-3
Recommendations
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- Convergence rates of damped inertial dynamics under geometric conditions and perturbations
- Optimal convergence rates for Nesterov acceleration
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
Keywords: rate of convergence; convex optimization; smooth optimization; growth condition; Nesterov acceleration; inertial gradient descent algorithm; Łojasiewicz condition
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10)
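For orientation only, the sketch below illustrates the kind of Nesterov-type inertial gradient scheme the keywords refer to. It is a minimal, generic Python implementation assuming a smooth convex objective with L-Lipschitz gradient and the classical momentum coefficient n/(n + α); the function name, parameters, and step-size choice are illustrative and are not taken from the paper, whose precise scheme and assumptions may differ.

```python
import numpy as np

def inertial_gradient_descent(grad_f, x0, step, alpha=3.0, n_iter=100):
    """Generic Nesterov-type inertial gradient scheme (illustrative only).

    x_{n+1} = y_n - step * grad_f(y_n)
    y_{n+1} = x_{n+1} + n/(n + alpha) * (x_{n+1} - x_n)
    """
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    for n in range(1, n_iter + 1):
        x = y - step * grad_f(y)                   # gradient step at the extrapolated point
        y = x + (n / (n + alpha)) * (x - x_prev)   # inertial (momentum) extrapolation
        x_prev = x
    return x_prev

# Usage example: minimize the quadratic f(x) = 0.5 * ||A x - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)                     # Lipschitz constant of the gradient
x_star = inertial_gradient_descent(grad, np.zeros(2), step=1.0 / L, n_iter=200)
```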
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Variational Analysis
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- Title not available
- Analysis and design of optimization algorithms via integral quadratic constraints
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- On semi- and subanalytic geometry
- Title not available
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- Optimal methods of smooth convex minimization
- Some methods of speeding up the convergence of iteration methods
- On the long time behavior of second order differential equations with asymptotically small dissipation
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Error bounds and Hölder metric subregularity
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- New Proximal Point Algorithms for Convex Minimization
- Convergence to equilibrium for the backward Euler scheme and applications
- On the proximal gradient algorithm with alternated inertia
- From error bounds to the complexity of first-order descent methods for convex functions
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Linear convergence of first order methods for non-strongly convex optimization
- Error bounds, quadratic growth, and linear convergence of proximal methods
- Title not available
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than 1/k²
- Activity identification and local linear convergence of forward-backward-type methods
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- The Differential Inclusion Modeling FISTA Algorithm and Optimality of Convergence Rate in the Case b ≤ 3
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Convergence rates of inertial forward-backward algorithms
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Optimal convergence rates for Nesterov acceleration
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
Cited In (13)
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- Fast optimization via inertial dynamics with closed-loop damping
- Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak-Łojasiewicz condition
- A nonmonotone accelerated proximal gradient method with variable stepsize strategy for nonsmooth and nonconvex minimization problems
- Optimal convergence rates for Nesterov acceleration
- Convergence rates of the heavy ball method for quasi-strongly convex optimization
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Stochastic differential equations for modeling first order optimization methods
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Optimal convergence rates for damped inertial gradient dynamics with flat geometries
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- Convergence rates of damped inertial dynamics under geometric conditions and perturbations