On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
DOI: 10.3934/eect.2022022
OpenAlex: W4226453336
MaRDI QID: Q2106043
Vyacheslav Kungurtsev, Hedy Attouch, Jalal Fadili
Publication date: 8 December 2022
Published in: Evolution Equations and Control Theory
Full work available at URL: https://arxiv.org/abs/2106.16159
Keywords: errors; convergence rates; perturbation; Lyapunov analysis; damped inertial dynamics; accelerated convex optimization; Hessian driven damping
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical optimization and variational techniques (65K10); Management decision making, including multiple objectives (90B50); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10); Dynamical systems in optimization and economics (37N40)
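The paper studies perturbed first-order methods derived from the inertial dynamics with Hessian driven damping, \(\ddot{x}(t) + \frac{\alpha}{t}\dot{x}(t) + \beta\,\nabla^2 f(x(t))\dot{x}(t) + \nabla f(x(t)) = 0\). A well-known feature of this system is that its discretization needs no Hessian: the damping term is approximated by a difference of consecutive gradients. The sketch below illustrates this idea on a quadratic; the function name, step size, and parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def hessian_damped_inertial(grad, x0, step=0.009, beta=0.5, alpha=3.0, iters=1000):
    """Illustrative first-order scheme for the inertial dynamics
        x'' + (alpha/t) x' + beta * Hess f(x) x' + grad f(x) = 0.
    The Hessian-driven damping term Hess f(x) x' is replaced by a
    difference of successive gradients, so only gradients are needed.
    (Parameter values here are illustrative, not from the paper.)
    """
    x_prev = x0.copy()
    x = x0.copy()
    g_prev = grad(x0)
    for k in range(1, iters + 1):
        g = grad(x)
        momentum = (k - 1) / (k - 1 + alpha)   # discretizes the alpha/t damping
        # gradient difference stands in for beta * Hess f(x) * x'
        y = x + momentum * (x - x_prev) - beta * np.sqrt(step) * (g - g_prev)
        x_prev, x = x, y - step * grad(y)      # forward (gradient) step at y
        g_prev = g
    return x

# Usage: minimize f(x) = 0.5 x^T A x for an ill-conditioned diagonal A.
A = np.diag([1.0, 10.0, 100.0])
x_star = hessian_damped_inertial(lambda x: A @ x, np.ones(3))
```

The gradient-difference correction attenuates the oscillations typical of heavy-ball and Nesterov iterations, which is the mechanism whose robustness under perturbations (errors in the gradients) the paper analyzes.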
Related Items (3)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Fast convex optimization via inertial dynamics with Hessian driven damping
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- Introductory lectures on convex optimization. A basic course.
- Asymptotic control and stabilization of nonlinear oscillators with non-isolated equilibria
- Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity
- On a second order dissipative ODE in Hilbert spaces with an integrable source term
- Continuous Newton-like inertial dynamics for monotone inclusions
- Understanding the acceleration phenomenon via high-resolution differential equations
- A control-theoretic perspective on optimal high-order optimization
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- An extension of the second order dynamical system that models Nesterov's convex gradient method
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Accelerated and Inexact Forward-Backward Algorithms
- Stability of Over-Relaxations for the Forward-Backward Algorithm, Application to FISTA
- Strong solutions for parabolic variational inequalities
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Finite Convergence of Proximal-Gradient Inertial Algorithms Combining Dry Friction with Hessian-Driven Damping
- Newton-like Inertial Dynamics and Proximal Algorithms Governed by Maximally Monotone Operators
- An Inertial Newton Algorithm for Deep Learning
- Optimal Convergence Rates for Nesterov Acceleration
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping
- Fast optimization via inertial dynamics with closed-loop damping