Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping
Publication: 6042225
Abstract: In a Hilbert space setting, for convex optimization, we show the convergence of the iterates to optimal solutions for a class of accelerated first-order algorithms. These algorithms can be interpreted as discrete temporal versions of an inertial dynamic involving both viscous damping and Hessian-driven damping. The asymptotically vanishing viscous damping is linked to Nesterov's accelerated gradient method, while the Hessian-driven damping significantly attenuates the oscillations. Since the Hessian-driven damping term can be written as the time derivative of the gradient along the trajectory, its discretization requires only gradient evaluations, so the resulting algorithms are first-order methods. These results complement the authors' previous work, which established the fast convergence of the function values and the fast convergence of the gradients towards zero.
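To make the discretization idea concrete, the following is a minimal, illustrative sketch of an inertial gradient scheme of the kind described in the abstract, where the Hessian-driven damping term beta * Hess f(x_k)(x_k - x_{k-1}) is replaced by the difference of consecutive gradients beta * (grad f(x_k) - grad f(x_{k-1})). The function name, the momentum coefficient k/(k + alpha), the step size, and the default parameter values are assumptions made for illustration; this is not the exact scheme analyzed in the paper.

```python
import numpy as np

def inertial_gradient_hessian_damping(grad, x0, step=0.05, alpha=3.0, beta=0.1, n_iter=500):
    """Minimal sketch of an inertial gradient scheme with Hessian-driven damping.

    The Hessian term beta * Hess f(x_k) (x_k - x_{k-1}) of the continuous dynamic
    is approximated by the difference of consecutive gradients
    beta * (grad(x_k) - grad(x_{k-1})), so only first-order information is used.
    All parameter names and default values are illustrative assumptions.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad(x_prev)
    for k in range(1, n_iter + 1):
        g = grad(x)
        # Extrapolation with asymptotically vanishing viscous damping (~ alpha/k),
        # corrected by the discrete "time derivative of the gradient".
        y = x + (k / (k + alpha)) * (x - x_prev) - beta * (g - g_prev)
        x_prev, g_prev = x, g
        x = y - step * grad(y)  # plain gradient step at the extrapolated point
    return x

# Toy usage on a convex quadratic f(x) = 0.5 * x^T A x, minimized at the origin.
A = np.diag([1.0, 10.0])
x_min = inertial_gradient_hessian_damping(lambda x: A @ x, x0=[1.0, 1.0])
```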
Recommendations
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
Cites work
- scientific article; zbMATH DE number 3850830
- A control-theoretic perspective on optimal high-order optimization
- A second-order differential system with Hessian-driven damping; application to non-elastic shock laws
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- Accelerated proximal point method for maximally monotone operators
- An inertial Newton algorithm for deep learning
- Asymptotic convergence of nonlinear contraction semigroups in Hilbert space
- Asymptotic for a second-order evolution equation with convex potential and vanishing damping term
- Continuous Newton-like inertial dynamics for monotone inclusions
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Convergence rates of inertial forward-backward algorithms
- Convex analysis and monotone operator theory in Hilbert spaces
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Finite convergence of proximal-gradient inertial algorithms combining dry friction with Hessian-driven damping
- Finite-time stabilization of continuous inertial dynamics combining dry friction with Hessian-driven damping
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Introductory lectures on convex optimization. A basic course.
- Newton-like inertial dynamics and proximal algorithms governed by maximally monotone operators
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Optimized first-order methods for smooth convex minimization
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than 1/k²
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- Understanding the acceleration phenomenon via high-resolution differential equations
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Weak versus strong convergence of a regularized Newton dynamic for maximal monotone operators
Cited in (12)
- Practical perspectives on symplectic accelerated optimization
- First-order inertial algorithms involving dry friction damping
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Approaching nonsmooth nonconvex optimization problems through first order dynamical systems with hidden acceleration and Hessian driven damping terms
- Fast continuous dynamics inside the graph of subdifferentials of nonsmooth convex functions
- An ordinary differential equation for modeling Halpern fixed-point Algorithm
- Second order dynamics featuring Tikhonov regularization and time scaling
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Convergence of inertial dynamics driven by sums of potential and nonpotential operators with implicit Newton-like damping
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- On the use of the root locus of polynomials with complex coefficients for estimating the basin of attraction for the continuous-time Newton and Householder methods