Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling
DOI: 10.3934/eect.2021010 · zbMath: 1495.37066 · arXiv: 2009.07620 · OpenAlex: W4287666062 · MaRDI QID: Q2119789
Hassan Riahi, Zaki Chbani, Hedy Attouch, Aicha Balhag
Publication date: 30 March 2022
Published in: Evolution Equations and Control Theory
Full work available at URL: https://arxiv.org/abs/2009.07620
Keywords: time rescaling; Hessian-driven damping; Nesterov accelerated gradient method; damped inertial gradient dynamics; fast convex optimization
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical optimization and variational techniques (65K10); Newton-type methods (49M15); Lyapunov and other classical stabilities (Lagrange, Poisson, \(L^p\), \(l^p\), etc.) in control theory (93D05); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10); Dynamical systems in optimization and economics (37N40)
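For orientation, a hedged sketch of the class of dynamics indicated by the title and keywords (the exact system, coefficient conditions, and convergence rates are those of the paper; the form below is assumed from the related works listed under Cites Work):

\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta(t)\,\nabla^{2} f\bigl(x(t)\bigr)\,\dot{x}(t) + b(t)\,\nabla f\bigl(x(t)\bigr) = 0, \qquad t \ge t_0 > 0,
\]

where \(f\) is a convex, continuously differentiable function on a Hilbert space, \(\alpha/t\) is the vanishing viscous damping of Nesterov type, \(\beta(t)\,\nabla^{2} f(x(t))\,\dot{x}(t)\) is the Hessian-driven damping term, and \(b(t)\) is the time-rescaling coefficient in front of the gradient.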
Related Items (13)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Fast convex optimization via inertial dynamics with Hessian driven damping
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- Introductory lectures on convex optimization. A basic course.
- Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient
- Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity
- Approaching nonsmooth nonconvex minimization through second-order proximal-gradient dynamical systems
- Convergence rate of inertial proximal algorithms with general extrapolation and proximal coefficients
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Second Order Forward-Backward Dynamical Systems For Monotone Inclusion Problems
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- The Convergence Problem for Dissipative Autonomous Systems
- Evolution equations for maximal monotone operators: asymptotic analysis in continuous and discrete time
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- New Proximal Point Algorithms for Convex Minimization
- On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces
- The Heavy Ball with Friction Method. I: The Continuous Dynamical System: Global Exploration of the Local Minima of a Real-Valued Function by Asymptotic Analysis of a Dissipative Dynamical System
- Convergence Rates of Inertial Forward-Backward Algorithms
- Asymptotic for a second-order evolution equation with convex potential and vanishing damping term
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Newton-like Inertial Dynamics and Proximal Algorithms Governed by Maximally Monotone Operators
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Some methods of speeding up the convergence of iteration methods