Adaptive restart of accelerated gradient methods under local quadratic growth condition
Publication: 5243267
DOI: 10.1093/imanum/drz007
OpenAlex: W2753277175
Wikidata: Q128357361 (Scholia: Q128357361)
MaRDI QID: Q5243267
Publication date: 18 November 2019
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1709.02300
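For orientation on the record's topic: the paper concerns restart schemes for accelerated gradient methods under a local quadratic growth condition. Below is a minimal Python sketch of the generic function-value restart heuristic for Nesterov-type acceleration, not the specific adaptive rule analyzed in the paper; the function name `accelerated_gd_restart`, the step size 1/L, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def accelerated_gd_restart(f, grad, x0, step, iters=500):
    """Nesterov-type accelerated gradient descent with a function-value
    restart: whenever the objective increases, the momentum is reset.
    Generic heuristic for illustration, not the paper's exact scheme."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    f_prev = f(x)
    for _ in range(iters):
        x_new = y - step * grad(y)                     # gradient step at the extrapolated point
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
        f_new = f(x_new)
        if f_new > f_prev:                             # restart test: objective went up
            y, t_new = x_new.copy(), 1.0               # reset momentum
        x, t, f_prev = x_new, t_new, f_new
    return x

# Example: a strongly convex quadratic, a setting with quadratic growth
# in which restarting recovers linear convergence.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
Q = A.T @ A + 0.1 * np.eye(20)
b = rng.standard_normal(20)
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b
L = np.linalg.eigvalsh(Q).max()                        # Lipschitz constant of the gradient
x_opt = accelerated_gd_restart(f, grad, np.zeros(20), 1.0 / L)
```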
Related Items (17)
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
Differentially Private Accelerated Optimization Algorithms
Fast gradient methods for uniformly convex and weakly smooth problems
Practical perspectives on symplectic accelerated optimization
FISTA is an automatic geometrically optimized algorithm for strongly convex functions
A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
Faster first-order primal-dual methods for linear programming using restarts and sharpness
Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
A simple nearly optimal restart scheme for speeding up first-order methods
A review of nonlinear FFT-based computational homogenization methods
Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
Sharpness, Restart, and Acceleration
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
A piecewise conservative method for unconstrained convex optimization
Universal intermediate gradient method for convex problems with inexact oracle