Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
From MaRDI portal
Abstract: We investigate an inertial algorithm of gradient type in connection with the minimization of a nonconvex differentiable function. The algorithm is formulated in the spirit of Nesterov's accelerated convex gradient method. We show that the generated sequences converge to a critical point of the objective function if a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we provide convergence rates for the generated sequences and the function values, formulated in terms of the Łojasiewicz exponent.
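The abstract describes an inertial (Nesterov-style) gradient iteration for a smooth nonconvex objective. As a minimal illustrative sketch, the generic template is: extrapolate with a momentum term, then take a gradient step at the extrapolated point. The coefficient beta_k = k/(k+3), the fixed step size, and the test objective below are assumptions chosen for illustration, not the exact scheme analyzed in the paper.

```python
import numpy as np

def inertial_gradient(grad, x0, step=0.05, n_iter=1000):
    # Generic Nesterov-style inertial gradient iteration (illustrative):
    #   y_k     = x_k + beta_k * (x_k - x_{k-1})   (inertial extrapolation)
    #   x_{k+1} = y_k - step * grad(y_k)           (gradient step at y_k)
    # beta_k = k / (k + 3) mimics Nesterov's vanishing-damping choice;
    # it is an assumption here, not the paper's exact parameterization.
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for k in range(1, n_iter + 1):
        beta = k / (k + 3.0)
        y = x + beta * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

# Smooth nonconvex example: f(x) = x^4/4 - x^2/2, so grad f(x) = x^3 - x,
# with critical points at x = -1, 0, 1. Started at x = 0.8, the iterates
# approach the nearby critical point x = 1.
x_star = inertial_gradient(lambda x: x**3 - x, np.array([0.8]))
```

The paper's convergence guarantees concern exactly such sequences: under a Kurdyka-Łojasiewicz assumption, the whole sequence converges to a critical point, with rates governed by the Łojasiewicz exponent.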
Recommendations
- A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
- An inertial Tseng's type proximal algorithm for nonsmooth and nonconvex optimization problems
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
Cites work
- scientific article (zbMATH DE number 3850830; no title available)
- scientific article (zbMATH DE number 3371284; no title available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- A dynamical approach to an inertial forward-backward algorithm for convex minimization
- A forward-backward dynamical approach to the minimization of the sum of a nonsmooth convex with a smooth nonconvex function
- A second-order dynamical approach with variable damping to nonconvex smooth minimization
- An inertial forward-backward algorithm for monotone inclusions
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping
- Analysis and design of optimization algorithms via integral quadratic constraints
- Approaching nonsmooth nonconvex minimization through second-order proximal-gradient dynamical systems
- Approaching nonsmooth nonconvex optimization problems through first order dynamical systems with hidden acceleration and Hessian driven damping terms
- Asymptotics for a class of non-linear evolution equations, with applications to geometric problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Clarke Subgradients of Stratifiable Functions
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Convergence of solutions to second-order gradient-like systems with analytic nonlinearities
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Convex analysis and monotone operator theory in Hilbert spaces
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Fast convex optimization via inertial dynamics with Hessian driven damping
- From error bounds to the complexity of first-order descent methods for convex functions
- Geometric categories and o-minimal structures
- Heavy-ball method in nonconvex optimization problems
- Inertial Douglas-Rachford splitting for monotone inclusion problems
- Introductory lectures on convex optimization. A basic course.
- Local convergence of the heavy-ball method and iPiano for non-convex optimization
- Newton-like dynamics associated to nonconvex optimization problems
- On damped second-order gradient systems
- On gradients of functions definable in o-minimal structures
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- On the Łojasiewicz-Simon gradient inequality
- Optimal convergence rates for Nesterov acceleration
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Quasi-Nonexpansive Iterations on the Affine Hull of Orbits: From Mann's Mean Value Algorithm to Inertial Methods
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Some methods of speeding up the convergence of iteration methods
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Variable metric forward-backward algorithm for minimizing the sum of a differentiable function and a convex function
- iPiano: inertial proximal algorithm for nonconvex optimization
Cited in (18)
- Convergence Theorems and Convergence Rates for the General Inertial Krasnosel'skiǐ-Mann Algorithm
- Sequence convergence of inexact nonconvex and nonsmooth algorithms with more realistic assumptions
- Convergence rates of damped inertial dynamics from multi-degree-of-freedom system
- Inertial Newton algorithms avoiding strict saddle points
- Fast optimization of charged particle dynamics with damping
- An inertial Tseng's type proximal algorithm for nonsmooth and nonconvex optimization problems
- A Nesterov type algorithm with double Tikhonov regularization: fast convergence of the function values and strong convergence to the minimal norm solution
- On the strong convergence of the trajectories of a Tikhonov regularized second order dynamical system with asymptotically vanishing damping
- Continuous Newton-like inertial dynamics for monotone inclusions
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
- The rate of convergence of optimization algorithms obtained via discretizations of heavy ball dynamical systems for convex optimization problems
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Solving convex optimization problems via a second order dynamical system with implicit Hessian damping and Tikhonov regularization
- Tikhonov regularization of a perturbed heavy ball system with vanishing damping
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- An accelerated stochastic extragradient-like algorithm with new stepsize rules for stochastic variational inequalities
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
This page was built for publication: Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
MaRDI item: Q2235149