Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
Publication: 2020604
DOI: 10.1007/s10107-020-01476-3
zbMath: 1465.90062
OpenAlex: W2914848482
MaRDI QID: Q2020604
Charles Dossal, Vassilis Apidopoulos, Jean-François Aujol, Aude Rondepierre
Publication date: 23 April 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-020-01476-3
Keywords: rate of convergence; convex optimization; growth condition; smooth optimization; Łojasiewicz condition; Nesterov acceleration; inertial gradient descent algorithm
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10)
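For orientation, the title refers to the Nesterov-type inertial gradient descent iteration analyzed under growth (Łojasiewicz-type) and flatness conditions. Below is a minimal Python sketch of that family of schemes, assuming the common extrapolation coefficient k/(k + α); the friction parameter α = 3, the fixed step size 1/L, and the least-squares test problem are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def inertial_gradient_descent(grad_f, x0, step, alpha=3.0, iters=500):
    """Nesterov-type inertial gradient descent:
        y_k     = x_k + k/(k + alpha) * (x_k - x_{k-1})
        x_{k+1} = y_k - step * grad_f(y_k)
    alpha and the fixed step size are illustrative choices."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        y = x + (k / (k + alpha)) * (x - x_prev)   # inertial extrapolation
        x_prev, x = x, y - step * grad_f(y)        # gradient step at y
    return x

# Usage: minimize f(x) = 0.5 * ||A x - b||^2 with step = 1/L, where
# L = ||A||_2^2 is the Lipschitz constant of the gradient.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2
x_star = inertial_gradient_descent(lambda x: A.T @ (A @ x - b),
                                   np.zeros(5), step=1.0 / L)
```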
Related Items
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Convergence Rates of the Heavy Ball Method for Quasi-strongly Convex Optimization
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- Fast optimization via inertial dynamics with closed-loop damping
- Optimal Convergence Rates for Nesterov Acceleration
- Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak-Łojasiewicz condition
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Error bounds and Hölder metric subregularity
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Convergence to equilibrium for the backward Euler scheme and applications
- On semi- and subanalytic geometry
- Introductory lectures on convex optimization. A basic course.
- From error bounds to the complexity of first-order descent methods for convex functions
- On the proximal gradient algorithm with alternated inertia
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- Adaptive restart for accelerated gradient schemes
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Linear convergence of first order methods for non-strongly convex optimization
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- On the long time behavior of second order differential equations with asymptotically small dissipation
- Optimal methods of smooth convex minimization
- New Proximal Point Algorithms for Convex Minimization
- Variational Analysis
- The Differential Inclusion Modeling FISTA Algorithm and Optimality of Convergence Rate in the Case $b \leq 3$
- Convergence Rates of Inertial Forward-Backward Algorithms
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Optimal Convergence Rates for Nesterov Acceleration
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Backtracking Strategies for Accelerated Descent Methods with Smooth Composite Objectives
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Some methods of speeding up the convergence of iteration methods