A speed restart scheme for a dynamics with Hessian-driven damping
Publication: 6086152
DOI: 10.1007/s10957-023-02290-5
arXiv: 2301.12240
OpenAlex: W4386447224
MaRDI QID: Q6086152
Juan José Maulén, Juan Peypouquet
Publication date: 9 November 2023
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2301.12240
Mathematics Subject Classification: Convex programming (90C25); Numerical optimization and variational techniques (65K10); Newton-type methods (49M15); Optimality conditions for problems involving ordinary differential equations (49K15)
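The record itself reproduces no formulas. As a hedged orientation only, based on the cited works of Attouch and coauthors on Hessian-driven damping (the damping parameters and the exact restart test used in this paper are assumptions, not taken from this page), the inertial dynamics in question typically has the form

\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^{2} f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0,
\]

and a speed restart in the sense of Su, Boyd and Candès resets the time variable to its initial value whenever the speed ceases to increase, i.e. whenever

\[
\frac{d}{dt}\,\bigl\|\dot{x}(t)\bigr\|^{2} = 2\,\bigl\langle \dot{x}(t), \ddot{x}(t) \bigr\rangle < 0 .
\]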
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Gradient methods for minimizing composite functions
- An inertial forward-backward algorithm for monotone inclusions
- Convergence theorems for inertial KM-type algorithms
- Functional analysis, Sobolev spaces and partial differential equations
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- A dynamical system associated with Newton's method for parametric approximations of convex minimization problems
- From error bounds to the complexity of first-order descent methods for convex functions
- Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Convergence of a relaxed inertial forward-backward algorithm for structured monotone inclusions
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Adaptive restart for accelerated gradient schemes
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Linear convergence of first order methods for non-strongly convex optimization
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Monotone Operators and the Proximal Point Algorithm
- First-Order Methods in Optimization
- Asymptotic for a second-order evolution equation with convex potential and vanishing damping term
- Some methods of speeding up the convergence of iteration methods
- Mean Value Methods in Iteration