Adaptive restart of accelerated gradient methods under local quadratic growth condition
DOI: 10.1093/IMANUM/DRZ007
OpenAlex: W2753277175
Wikidata: Q128357361 (Scholia: Q128357361)
MaRDI QID: Q5243267
FDO: Q5243267
Authors: Olivier Fercoq, Zheng Qu
Publication date: 18 November 2019
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1709.02300
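For orientation only, the sketch below illustrates the general idea behind the publication's topic: restarting an accelerated gradient method adaptively so that linear convergence is recovered under a quadratic growth condition. It implements the classical function-value restart heuristic for Nesterov's accelerated gradient method, not the specific scheme analysed in the paper; the names `f`, `grad_f`, the step size `1/L`, and the toy quadratic are illustrative assumptions.

```python
import numpy as np

def accelerated_gradient_with_restart(f, grad_f, x0, L, n_iters=500):
    """Accelerated gradient descent with function-value adaptive restart.

    `f`, `grad_f`, `x0`, `L` (a Lipschitz constant of grad_f) and `n_iters`
    are illustrative inputs; this is the generic restart heuristic, not the
    paper's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()          # extrapolated point
    theta = 1.0           # momentum parameter
    f_prev = f(x)
    for _ in range(n_iters):
        x_new = y - grad_f(y) / L                      # gradient step at the extrapolated point
        theta_new = (1.0 + np.sqrt(1.0 + 4.0 * theta**2)) / 2.0
        beta = (theta - 1.0) / theta_new
        f_new = f(x_new)
        if f_new > f_prev:                             # objective increased: restart momentum
            theta_new = 1.0
            y = x_new.copy()
        else:
            y = x_new + beta * (x_new - x)             # usual Nesterov extrapolation
        x, theta, f_prev = x_new, theta_new, f_new
    return x


if __name__ == "__main__":
    # Toy quadratic f(x) = 0.5 * x^T A x with A positive definite, so the
    # quadratic growth condition holds with the smallest eigenvalue of A.
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad_f = lambda x: A @ x
    x_star = accelerated_gradient_with_restart(f, grad_f, np.ones(3), L=100.0)
    print(f(x_star))  # should be close to 0
```

In this heuristic the momentum is reset whenever the objective increases, which avoids the oscillations of plain accelerated methods on strongly-curved problems without requiring knowledge of the growth constant.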
Recommendations
- Adaptive restart for accelerated gradient schemes
- Adaptive restart of the optimized gradient method for convex optimization
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Restart of Accelerated First-Order Methods With Linear Convergence Under a Quadratic Functional Growth Condition
- Accelerate stochastic subgradient method by leveraging local growth condition
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- scientific article; zbMATH DE number 4099037
- On restart procedures for the conjugate gradient method
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Stochastic optimization with adaptive restart: a framework for integrated local and global learning
Cited In (20)
- Lippmann-Schwinger solvers for the computational homogenization of materials with pores
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- A review of nonlinear FFT-based computational homogenization methods
- Practical perspectives on symplectic accelerated optimization
- Universal intermediate gradient method for convex problems with inexact oracle
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Sharpness, restart, and acceleration
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Parameter-free FISTA by adaptive restart and backtracking
- A simple nearly optimal restart scheme for speeding up first-order methods
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
- A piecewise conservative method for unconstrained convex optimization
- Perseus: a simple and optimal high-order method for variational inequalities
- Fast gradient methods for uniformly convex and weakly smooth problems
- FISTA is an automatic geometrically optimized algorithm for strongly convex functions
- Differentially Private Accelerated Optimization Algorithms