Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
DOI: 10.1007/s00245-023-09997-x · zbMATH Open: 1522.37096 · arXiv: 2203.05457 · OpenAlex: W4280511288 · MaRDI QID: Q6166341
H. Riahi, Hédy Attouch, Aicha Balhag, Z. Chbani
Publication date: 6 July 2023
Published in: Applied Mathematics and Optimization
Full work available at URL: https://arxiv.org/abs/2203.05457
Recommendations
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- On the asymptotic convergence and acceleration of gradient methods
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Accelerated extra-gradient descent: a novel accelerated first-order method
- Combining fast inertial dynamics for convex optimization with Tikhonov regularization
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Accelerated iterative regularization via dual diagonal descent
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Gradient descent for Tikhonov functionals with sparsity constraints: theory and numerical comparison of step size rules
- On accelerated proximal gradient algorithms with parameters in extrapolation coefficients
Keywords: convex optimization; Hessian-driven damping; hierarchical minimization; accelerated gradient methods; damped inertial dynamics; Nesterov accelerated gradient method; Tikhonov approximation
MSC classification: Numerical mathematical programming methods (65K05); Numerical optimization and variational techniques (65K10); Convex programming (90C25); Management decision making, including multiple objectives (90B50); Dynamical systems in optimization and economics (37N40); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10); Numerical methods for variational inequalities and related problems (65K15); Numerical solution of ill-posed problems involving ordinary differential equations (65L08); Numerical solution of inverse problems involving ordinary differential equations (65L09)
Cites Work
- Convex analysis and monotone operator theory in Hilbert spaces
- Introductory lectures on convex optimization. A basic course.
- A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
- Viscosity Solutions of Minimization Problems
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- Some methods of speeding up the convergence of iteration methods
- A second-order differential system with Hessian-driven damping; application to non-elastic shock laws
- Fast convex optimization via inertial dynamics with Hessian driven damping
- On the long time behavior of second order differential equations with asymptotically small dissipation
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Proximal Point Algorithm Controlled by a Slowly Vanishing Term: Applications to Hierarchical Minimization
- A Continuous Dynamical Newton-Like Approach to Solving Monotone Inclusions
- Asymptotic behavior of coupled dynamical systems with multiscale aspects
- A dynamical approach to convex minimization coupling approximation with the steepest descent method
- Inertial gradient-like dynamical system controlled by a stabilizing term
- Asymptotic selection of viscosity equilibria of semilinear evolution equations by the introduction of a slowly vanishing term
- Strong asymptotic convergence of evolution equations governed by maximal monotone operators with Tikhonov regularization
- Newton-like dynamics and forward-backward methods for structured monotone inclusions in Hilbert spaces
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Asymptotic control and stabilization of nonlinear oscillators with non-isolated equilibria
- On an asymptotically autonomous system with Tikhonov type regularizing term
- A convergence result for nonautonomous subgradient evolution equations and its application to the steepest descent exponential penalty trajectory in linear programming
- A dynamic approach to a proximal-Newton method for monotone inclusions in Hilbert spaces, with complexity \(\mathcal{O}(1/n^2)\)
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Asymptotic behavior of gradient-like dynamical systems involving inertia and multiscale aspects
- Combining fast inertial dynamics for convex optimization with Tikhonov regularization
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity
- Damped inertial dynamics with vanishing Tikhonov regularization: strong asymptotic convergence towards the minimum norm solution
- The Differential Inclusion Modeling FISTA Algorithm and Optimality of Convergence Rate in the Case \(b \leq 3\)
- Fast optimization via inertial dynamics with closed-loop damping
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- Understanding the acceleration phenomenon via high-resolution differential equations
- Asymptotic for a second order evolution equation with damping and regularizing terms
- Newton-type inertial algorithms for solving monotone equations governed by sums of potential and nonpotential operators
- From the Ravine Method to the Nesterov Method and Vice Versa: A Dynamical System Perspective
Cited In (13)
- Convex optimization via inertial algorithms with vanishing Tikhonov regularization: fast convergence to the minimum norm solution
- Fast convergence of inertial multiobjective gradient-like systems with asymptotic vanishing damping
- A fast continuous time approach for non-smooth convex optimization using Tikhonov regularization technique
- A Nesterov type algorithm with double Tikhonov regularization: fast convergence of the function values and strong convergence to the minimal norm solution
- Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping
- On the strong convergence of continuous Newton-like inertial dynamics with Tikhonov regularization for monotone inclusions
- Second order dynamics featuring Tikhonov regularization and time scaling
- Fast convergence rate of values with strong convergence of trajectories via inertial dynamics with Tikhonov regularization terms and asymptotically vanishing damping
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
- Fast convex optimization via differential equation with Hessian-driven damping and Tikhonov regularization
- Solving convex optimization problems via a second order dynamical system with implicit Hessian damping and Tikhonov regularization
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- Strong Convergence of Trajectories via Inertial Dynamics Combining Hessian-Driven Damping and Tikhonov Regularization for General Convex Minimizations