Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
DOI: 10.1007/s10107-020-01534-w · zbMATH Open: 1478.90097 · arXiv: 1807.00387 · OpenAlex: W3040672097 · MaRDI QID: Q2235149
Authors: Szilárd László
Publication date: 20 October 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1807.00387
Recommendations
- A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
- An inertial Tseng's type proximal algorithm for nonsmooth and nonconvex optimization problems
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
Keywords: nonconvex optimization; convergence rate; inertial algorithm; Łojasiewicz exponent; Kurdyka-Łojasiewicz inequality
Numerical optimization and variational techniques (65K10); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
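The keywords above refer to inertial (heavy-ball type) gradient methods for smooth non-convex minimization. As a rough illustration only, here is a minimal sketch of the classical Polyak heavy-ball iteration on a smooth non-convex function; this is the generic scheme underlying this literature, not the specific algorithm or step-size rules analyzed in the paper, and the test function and parameters are chosen for illustration:

```python
import numpy as np

def inertial_gradient_descent(grad, x0, step=0.01, beta=0.9, iters=500):
    """Generic heavy-ball / inertial gradient iteration:
        x_{n+1} = x_n - step * grad(x_n) + beta * (x_n - x_{n-1}).
    Classical Polyak scheme, shown for illustration; NOT the
    algorithm analyzed in the recorded publication."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        # gradient step plus momentum (inertial) term
        x_next = x - step * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Smooth non-convex test function: f(x) = x^2 + 3*sin(x)^2,
# whose only critical point is the global minimizer x = 0.
grad = lambda x: 2 * x + 6 * np.sin(x) * np.cos(x)
x_star = inertial_gradient_descent(grad, np.array([2.5]))
```

Under a Kurdyka-Łojasiewicz assumption on the objective, such inertial iterations are known to converge to a critical point, with rates governed by the Łojasiewicz exponent.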
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- iPiano: inertial proximal algorithm for nonconvex optimization
- Convex analysis and monotone operator theory in Hilbert spaces
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Analysis and design of optimization algorithms via integral quadratic constraints
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Geometric categories and o-minimal structures
- Clarke Subgradients of Stratifiable Functions
- On the Łojasiewicz-Simon gradient inequality
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- Inertial Douglas-Rachford splitting for monotone inclusion problems
- Title not available
- An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping
- Asymptotics for a class of non-linear evolution equations, with applications to geometric problems
- On gradients of functions definable in o-minimal structures
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Variable metric forward-backward algorithm for minimizing the sum of a differentiable function and a convex function
- A dynamical approach to an inertial forward-backward algorithm for convex minimization
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- Some methods of speeding up the convergence of iteration methods
- An inertial forward-backward algorithm for monotone inclusions
- Fast convex optimization via inertial dynamics with Hessian driven damping
- On the convergence of the iterates of the “fast iterative shrinkage/thresholding algorithm”
- Heavy-ball method in nonconvex optimization problems
- From error bounds to the complexity of first-order descent methods for convex functions
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Convergence of solutions to second-order gradient-like systems with analytic nonlinearities
- Approaching nonsmooth nonconvex minimization through second-order proximal-gradient dynamical systems
- A forward-backward dynamical approach to the minimization of the sum of a nonsmooth convex with a smooth nonconvex function
- Quasi-Nonexpansive Iterations on the Affine Hull of Orbits: From Mann's Mean Value Algorithm to Inertial Methods
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Local convergence of the heavy-ball method and iPiano for non-convex optimization
- Approaching nonsmooth nonconvex optimization problems through first order dynamical systems with hidden acceleration and Hessian driven damping terms
- A second-order dynamical approach with variable damping to nonconvex smooth minimization
- On damped second-order gradient systems
- The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Optimal convergence rates for Nesterov acceleration
- Newton-like dynamics associated to nonconvex optimization problems
Cited In (18)
- Convergence Theorems and Convergence Rates for the General Inertial Krasnosel’skiǐ–Mann Algorithm
- Sequence convergence of inexact nonconvex and nonsmooth algorithms with more realistic assumptions
- Convergence rates of damped inertial dynamics from multi-degree-of-freedom system
- Inertial Newton algorithms avoiding strict saddle points
- Fast optimization of charged particle dynamics with damping
- An inertial Tseng's type proximal algorithm for nonsmooth and nonconvex optimization problems
- A Nesterov type algorithm with double Tikhonov regularization: fast convergence of the function values and strong convergence to the minimal norm solution
- On the strong convergence of the trajectories of a Tikhonov regularized second order dynamical system with asymptotically vanishing damping
- Continuous Newton-like inertial dynamics for monotone inclusions
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
- The rate of convergence of optimization algorithms obtained via discretizations of heavy ball dynamical systems for convex optimization problems
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Solving convex optimization problems via a second order dynamical system with implicit Hessian damping and Tikhonov regularization
- Tikhonov regularization of a perturbed heavy ball system with vanishing damping
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- An accelerated stochastic extragradient-like algorithm with new stepsize rules for stochastic variational inequalities
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems