Adaptive restart of accelerated gradient methods under local quadratic growth condition
From MaRDI portal
Publication:5243267
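For context, the technique named in the title (adaptive restart of an accelerated gradient method, here under a function-value restart test in the spirit of O'Donoghue and Candès) can be sketched as follows. This is an illustrative implementation on a strongly convex quadratic, not the paper's exact algorithm; the function names and test problem are chosen for the example:

```python
import numpy as np

def fista_with_restart(grad, f, x0, L, iters=500):
    """Accelerated gradient method with adaptive (function-value) restart:
    the momentum is reset whenever the objective increases."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    f_prev = f(x)
    for _ in range(iters):
        x_new = y - grad(y) / L                      # gradient step from extrapolated point
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2     # Nesterov momentum parameter update
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
        f_new = f(x_new)
        if f_new > f_prev:                           # objective went up: restart momentum
            t_new, y = 1.0, x_new
        x, t, f_prev = x_new, t_new, f_new
    return x

# Example: quadratic f(x) = 0.5 * x^T A x, which satisfies quadratic growth.
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = fista_with_restart(grad, f, np.array([1.0, 1.0]), L=100.0)
```

The restart test needs no knowledge of the growth (or strong convexity) constant, which is the practical appeal of adaptive restart schemes.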
Recommendations
- Adaptive restart for accelerated gradient schemes
- Adaptive restart of the optimized gradient method for convex optimization
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Restart of Accelerated First-Order Methods With Linear Convergence Under a Quadratic Functional Growth Condition
- Accelerate stochastic subgradient method by leveraging local growth condition
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- On restart procedures for the conjugate gradient method
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Stochastic optimization with adaptive restart: a framework for integrated local and global learning
Cited in (20)
- Parameter-free FISTA by adaptive restart and backtracking
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- Lippmann-Schwinger solvers for the computational homogenization of materials with pores
- Fast gradient methods for uniformly convex and weakly smooth problems
- Practical perspectives on symplectic accelerated optimization
- Universal intermediate gradient method for convex problems with inexact oracle
- A simple nearly optimal restart scheme for speeding up first-order methods
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- A review of nonlinear FFT-based computational homogenization methods
- Differentially Private Accelerated Optimization Algorithms
- Sharpness, restart, and acceleration
- A piecewise conservative method for unconstrained convex optimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- FISTA is an automatic geometrically optimized algorithm for strongly convex functions
- Perseus: a simple and optimal high-order method for variational inequalities