Sharpness, Restart, and Acceleration


DOI: 10.1137/18M1224568
zbMath: 1435.90109
arXiv: 1702.03828
MaRDI QID: Q5210521

Alexandre d'Aspremont, Vincent Roulet

Publication date: 21 January 2020

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1702.03828




Related Items (18)

Additive Schwarz methods for convex optimization with backtracking
Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
WARPd: A Linearly Convergent First-Order Primal-Dual Algorithm for Inverse Problems with Approximate Sharpness Conditions
Accelerated additive Schwarz methods for convex optimization with adaptive restart
Fast gradient methods for uniformly convex and weakly smooth problems
Practical perspectives on symplectic accelerated optimization
Robust obstacle reconstruction in an elastic medium
Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
Radial duality. II: Applications and algorithms
A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
Parking on supercritical Galton-Watson trees
Faster first-order primal-dual methods for linear programming using restarts and sharpness
On optimal universal first-order methods for minimizing heterogeneous sums
A simple nearly optimal restart scheme for speeding up first-order methods
General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
NESTANets: stable, accurate and efficient neural networks for analysis-sparse inverse problems
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
Restarting Frank-Wolfe: faster rates under Hölderian error bounds



