Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
DOI: 10.1007/s10107-020-01534-w | zbMath: 1478.90097 | arXiv: 1807.00387 | OpenAlex: W3040672097 | MaRDI QID: Q2235149
Publication date: 20 October 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1807.00387
Keywords: nonconvex optimization; convergence rate; Łojasiewicz exponent; Kurdyka-Łojasiewicz inequality; inertial algorithm
MSC classification: Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10)
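The publication concerns convergence rates of an inertial gradient-type method for smooth non-convex minimization under the Kurdyka-Łojasiewicz inequality. As a rough illustration of the algorithm class named in the title, the sketch below implements a generic inertial (heavy-ball type) gradient step x_{k+1} = x_k - s ∇f(x_k) + β (x_k - x_{k-1}); the step size, inertial coefficient, test function, and stopping rule are illustrative assumptions and not the specific scheme or parameters analyzed in the paper.

```python
# Minimal sketch of a generic inertial gradient (heavy-ball type) step:
#   x_{k+1} = x_k - s * grad_f(x_k) + beta * (x_k - x_{k-1})
# Step size s, inertial coefficient beta, and the stopping rule are
# illustrative assumptions, not the paper's specific scheme.
import numpy as np

def inertial_gradient(grad_f, x0, s=1e-2, beta=0.5, max_iter=10_000, tol=1e-8):
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(max_iter):
        x_next = x - s * grad_f(x) + beta * (x - x_prev)
        if np.linalg.norm(x_next - x) < tol:  # stop when iterates stabilize
            return x_next
        x_prev, x = x, x_next
    return x

# Example: a smooth non-convex objective f(x) = x^4/4 - x^2/2,
# whose critical points are x = -1, 0, 1.
if __name__ == "__main__":
    grad = lambda x: x**3 - x
    print(inertial_gradient(grad, x0=np.array([0.3])))
```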
Related Items
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- The rate of convergence of optimization algorithms obtained via discretizations of heavy ball dynamical systems for convex optimization problems
- A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
- On the strong convergence of the trajectories of a Tikhonov regularized second order dynamical system with asymptotically vanishing damping
- Inertial Newton algorithms avoiding strict saddle points
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
- Continuous Newton-like inertial dynamics for monotone inclusions
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- Convergence rates of damped inertial dynamics from multi-degree-of-freedom system
- Tikhonov Regularization of a Perturbed Heavy Ball System with Vanishing Damping
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- Inertial Douglas-Rachford splitting for monotone inclusion problems
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Variable metric forward-backward algorithm for minimizing the sum of a differentiable function and a convex function
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Asymptotics for a class of non-linear evolution equations, with applications to geometric problems
- An inertial forward-backward algorithm for monotone inclusions
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Convergence of solutions to second-order gradient-like systems with analytic nonlinearities
- On gradients of functions definable in o-minimal structures
- On the Łojasiewicz-Simon gradient inequality
- Introductory lectures on convex optimization. A basic course
- Local convergence of the heavy-ball method and iPiano for non-convex optimization
- Approaching nonsmooth nonconvex optimization problems through first order dynamical systems with hidden acceleration and Hessian driven damping terms
- From error bounds to the complexity of first-order descent methods for convex functions
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Geometric categories and o-minimal structures
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Approaching nonsmooth nonconvex minimization through second-order proximal-gradient dynamical systems
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Newton-like dynamics associated to nonconvex optimization problems
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- On damped second-order gradient systems
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Clarke Subgradients of Stratifiable Functions
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- A forward-backward dynamical approach to the minimization of the sum of a nonsmooth convex with a smooth nonconvex function
- Quasi-Nonexpansive Iterations on the Affine Hull of Orbits: From Mann's Mean Value Algorithm to Inertial Methods
- A Dynamical Approach to an Inertial Forward-Backward Algorithm for Convex Minimization
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- The Proximal Alternating Direction Method of Multipliers in the Nonconvex Setting: Convergence Analysis and Rates
- Optimal Convergence Rates for Nesterov Acceleration
- A second-order dynamical approach with variable damping to nonconvex smooth minimization
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Some methods of speeding up the convergence of iteration methods
- Convex analysis and monotone operator theory in Hilbert spaces
- An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping
- Heavy-ball method in nonconvex optimization problems