Generating Nesterov's accelerated gradient algorithm by using optimal control theory for optimization
DOI: 10.1016/j.cam.2022.114968
OpenAlex: W4226197443
MaRDI QID: Q2112702
Publication date: 11 January 2023
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://arxiv.org/abs/2203.17226
Keywords: convex optimization; control Lyapunov function; Lie derivative; accelerated optimization; singular optimal control theory
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Trade models (91B60); Nonlinear differential equations in abstract spaces (34G20)
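The publication concerns deriving Nesterov's accelerated gradient algorithm from optimal control theory. For orientation only, below is a minimal sketch of the standard Nesterov iteration in its textbook form, not the paper's control-theoretic derivation; the step size 1/L and the momentum schedule t_{k+1} = (1 + sqrt(1 + 4 t_k^2))/2 are conventional assumptions for an L-smooth convex objective.

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iters=100):
    """Standard Nesterov accelerated gradient for an L-smooth convex f.

    grad: callable returning the gradient of f; L: Lipschitz constant of grad f.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        x_next = y - grad(y) / L                              # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0     # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)      # extrapolation (momentum) step
        x, t = x_next, t_next
    return x

# Hypothetical usage: minimize f(x) = 0.5 * ||A x - b||^2 with L = lambda_max(A^T A).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A.T @ A).max()
x_star = nesterov_agd(lambda x: A.T @ (A @ x - b), np.zeros(2), L, n_iters=200)
```

The cited work of Su, Boyd, and Candès models this iteration, in the continuous-time limit, by the ordinary differential equation \(\ddot{x}(t) + \tfrac{3}{t}\,\dot{x}(t) + \nabla f(x(t)) = 0\), which is the kind of dynamical-systems viewpoint the publication's control-theoretic approach builds on.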
Cites Work
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- A convergent process of price adjustment and global Newton methods
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- An optimal control theory for nonlinear optimization
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Optimization Methods for Large-Scale Machine Learning
- Some methods of speeding up the convergence of iteration methods
- The Solution of Nonlinear Systems of Equations by A-Stable Integration Techniques