Adaptive restart of accelerated gradient methods under local quadratic growth condition

Publication: 5243267

DOI: 10.1093/imanum/drz007
OpenAlex: W2753277175
Wikidata: Q128357361 (Scholia: Q128357361)
MaRDI QID: Q5243267

Olivier Fercoq, Zheng Qu

Publication date: 18 November 2019

Published in: IMA Journal of Numerical Analysis

Full work available at URL: https://arxiv.org/abs/1709.02300
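
For orientation, the sketch below illustrates the general idea behind the paper's topic: accelerated (Nesterov-type) gradient descent in which the momentum is adaptively reset. This is a minimal sketch of the classical function-value restart heuristic (O'Donoghue and Candès), not the specific adaptive scheme analysed in the paper; the quadratic test problem, the step size 1/L, and all names in the code are illustrative assumptions.

    # Minimal sketch, assuming a smooth strongly convex quadratic objective.
    # Generic function-value restart for Nesterov acceleration; NOT the
    # paper's specific adaptive scheme.
    import numpy as np

    def agd_with_restart(f, grad_f, x0, L, n_iter=500):
        """Accelerated gradient descent; momentum is reset (restarted)
        whenever the objective value increases."""
        x, y, t = x0.copy(), x0.copy(), 1.0
        f_prev = f(x)
        for _ in range(n_iter):
            x_new = y - grad_f(y) / L                      # gradient step at the extrapolated point
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
            x, t = x_new, t_new
            f_cur = f(x)
            if f_cur > f_prev:                             # function-value restart test
                y, t = x.copy(), 1.0                       # drop accumulated momentum
            f_prev = f_cur
        return x

    # Illustrative use: a strongly convex quadratic satisfies quadratic growth.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 10))
    A = M.T @ M + 0.1 * np.eye(10)
    b = rng.standard_normal(10)
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad_f = lambda x: A @ x - b
    x_opt = agd_with_restart(f, grad_f, np.zeros(10), L=np.linalg.eigvalsh(A).max())

A gradient-based restart test (reset when the gradient at the extrapolated point makes an obtuse angle with the last step) is a common alternative that avoids the extra function evaluation.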




Related Items (17)

On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
Differentially Private Accelerated Optimization Algorithms
Fast gradient methods for uniformly convex and weakly smooth problems
Practical perspectives on symplectic accelerated optimization
FISTA is an automatic geometrically optimized algorithm for strongly convex functions
A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
Faster first-order primal-dual methods for linear programming using restarts and sharpness
Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
A simple nearly optimal restart scheme for speeding up first-order methods
A review of nonlinear FFT-based computational homogenization methods
Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
Sharpness, Restart, and Acceleration
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
A piecewise conservative method for unconstrained convex optimization
Universal intermediate gradient method for convex problems with inexact oracle



