Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping
DOI: 10.1080/02331934.2021.2009828 · zbMATH Open: 1519.90260 · arXiv: 2107.05943 · OpenAlex: W3180018585 · MaRDI QID: Q6042225 · FDO: Q6042225
Authors: Hédy Attouch, Z. Chbani, Jalal Fadili, H. Riahi
Publication date: 16 May 2023
Published in: Optimization
Full work available at URL: https://arxiv.org/abs/2107.05943
Recommendations
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
Keywords: convergence of iterates; time rescaling; Nesterov accelerated gradient method; Hessian driven damping; inertial optimization algorithms
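To illustrate the class of methods this record concerns, the following is a minimal sketch of an inertial gradient algorithm with Hessian-driven damping of the kind studied in the Attouch–Chbani–Fadili–Riahi line of work. The gradient-correction term plays the role of the Hessian damping without evaluating the Hessian itself. The parameter choices (`s`, `alpha`, `beta`) and the exact form of the extrapolation coefficient are illustrative assumptions, not the paper's precise scheme or tuning.

```python
import numpy as np

def inertial_hessian_damped(grad, x0, s=0.1, alpha=3.0, beta=0.1, iters=1000):
    """Sketch of an inertial gradient method with Hessian-driven damping.

    s     : step size (assumed s <= 1/L for an L-smooth objective)
    alpha : viscous (vanishing) damping parameter, alpha >= 3
    beta  : Hessian-driven damping coefficient
    All values here are illustrative defaults, not the authors' tuning.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad(x_prev)
    rs = np.sqrt(s)
    for k in range(1, iters + 1):
        g = grad(x)
        a_k = k / (k + alpha)  # extrapolation coefficient, roughly 1 - alpha/k
        # Hessian-driven damping enters as a correction built from
        # successive gradients, avoiding any Hessian evaluation.
        y = (x + a_k * (x - x_prev)
             - beta * rs * (g - g_prev)
             - (beta * rs / k) * g_prev)
        x_prev, g_prev = x, g
        x = y - s * grad(y)  # gradient step at the extrapolated point
    return x

# Example: minimize f(x) = ||x||^2 / 2, whose gradient is x.
xstar = inertial_hessian_damped(lambda x: x, np.array([5.0, -3.0]))
```

On this strongly convex quadratic the iterates converge to the minimizer at the origin; the gradient-correction term damps the oscillations that plain heavy-ball or Nesterov momentum would exhibit.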
Cites Work
- Convex analysis and monotone operator theory in Hilbert spaces
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Optimized first-order methods for smooth convex minimization
- Asymptotic convergence of nonlinear contraction semigroups in Hilbert space
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- A second-order differential system with Hessian-driven damping; application to non-elastic shock laws
- Fast convex optimization via inertial dynamics with Hessian driven damping
- On the convergence of the iterates of the ``fast iterative shrinkage/thresholding algorithm''
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Weak versus strong convergence of a regularized Newton dynamic for maximal monotone operators
- Continuous Newton-like inertial dynamics for monotone inclusions
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Newton-like inertial dynamics and proximal algorithms governed by maximally monotone operators
- A control-theoretic perspective on optimal high-order optimization
- Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Asymptotic for a second-order evolution equation with convex potential and vanishing damping term
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Convergence rates of inertial forward-backward algorithms
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- Understanding the acceleration phenomenon via high-resolution differential equations
- Accelerated proximal point method for maximally monotone operators
- Finite convergence of proximal-gradient inertial algorithms combining dry friction with Hessian-driven damping
- Finite-time stabilization of continuous inertial dynamics combining dry friction with Hessian-driven damping
- An inertial Newton algorithm for deep learning
Cited In (12)
- Practical perspectives on symplectic accelerated optimization
- First-order inertial algorithms involving dry friction damping
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Fast continuous dynamics inside the graph of subdifferentials of nonsmooth convex functions
- An ordinary differential equation for modeling Halpern fixed-point Algorithm
- Second order dynamics featuring Tikhonov regularization and time scaling
- Approaching nonsmooth nonconvex optimization problems through first order dynamical systems with hidden acceleration and Hessian driven damping terms
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Convergence of inertial dynamics driven by sums of potential and nonpotential operators with implicit Newton-like damping
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- On the use of the root locus of polynomials with complex coefficients for estimating the basin of attraction for the continuous-time Newton and Householder methods