First-order optimization algorithms via inertial systems with Hessian driven damping
DOI: 10.1007/s10107-020-01591-1 · zbMath: 1497.37121 · arXiv: 1907.10536 · OpenAlex: W3102391325 · MaRDI QID: Q2133411
Hassan Riahi, Zaki Chbani, Hedy Attouch, Jalal Fadili
Publication date: 29 April 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1907.10536
Keywords: time rescaling; Ravine method; Nesterov accelerated gradient method; Hessian driven damping; inertial optimization algorithms
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical optimization and variational techniques (65K10); Management decision making, including multiple objectives (90B50); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10); Simulation of dynamical systems (37M05); Dynamical systems in optimization and economics (37N40)
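For orientation, the record's keywords name the Nesterov accelerated gradient method, the baseline that the paper's Hessian-damped inertial schemes build on. Below is a minimal sketch of that classical method with the k/(k+3) momentum rule; the quadratic test objective, step size, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nesterov_agd(grad, x0, step, iters):
    """Nesterov accelerated gradient with the classical k/(k+3) momentum."""
    x = y = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        x_next = y - step * grad(y)                 # gradient step at the extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)   # momentum extrapolation
        x = x_next
    return x

# Illustrative problem: minimize f(x) = 0.5 * ||A x - b||^2 (assumed, not from the paper)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
grad = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A.T @ A, 2)            # 1/L, L = largest eigenvalue of A^T A
x_star = nesterov_agd(grad, np.zeros(2), step, 5000)
```

The paper's contribution concerns adding Hessian-driven damping terms on top of such inertial dynamics, which attenuates the oscillations that plain momentum methods exhibit on ill-conditioned problems.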
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Global convergence of a closed-loop regularized Newton method for solving monotone inclusions in Hilbert spaces
- Universal gradient methods for convex optimization problems
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Stability of Over-Relaxations for the Forward-Backward Algorithm, Application to FISTA
- On the long time behavior of second order differential equations with asymptotically small dissipation
- On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces
- Convergence Rates of Inertial Forward-Backward Algorithms
- Asymptotics for a second-order evolution equation with convex potential and vanishing damping term
- A Dynamical Approach to an Inertial Forward-Backward Algorithm for Convex Minimization
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Some methods of speeding up the convergence of iteration methods
- An introduction to continuous optimization for imaging
- Convex analysis and monotone operator theory in Hilbert spaces