A Generic Acceleration Framework for Stochastic Composite Optimization

From MaRDI portal
Publication:6319902

arXiv: 1906.01164 · MaRDI QID: Q6319902 · FDO: Q6319902


Authors: Andrei Kulunchakov, Julien Mairal


Publication date: 3 June 2019

Abstract: In this paper, we introduce various mechanisms to obtain accelerated first-order stochastic optimization algorithms when the objective function is convex or strongly convex. Specifically, we extend the Catalyst approach, originally designed for deterministic objectives, to the stochastic setting. Given an optimization method with mild convergence guarantees for strongly convex problems, the challenge is to accelerate convergence to a noise-dominated region, and then achieve convergence with an optimal worst-case complexity depending on the noise variance of the gradients. A side contribution of our work is a generic analysis that can handle inexact proximal operators, providing new insights into the robustness of stochastic algorithms when the proximal operator cannot be exactly computed.
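The Catalyst mechanism described in the abstract wraps an inner solver around a sequence of regularized proximal subproblems with Nesterov-style extrapolation. The sketch below is illustrative only, not the authors' exact algorithm: it applies a Catalyst-style outer loop to a strongly convex quadratic, with plain gradient descent as the (inexact) inner solver; all names, step sizes, and parameter choices here are assumptions for the example.

```python
import numpy as np

# Illustrative Catalyst-style acceleration sketch (not the paper's exact method).
# Problem: min_x f(x) for a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)          # symmetric positive definite, so f is strongly convex
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)   # exact minimizer, for checking the result

def grad_f(x):
    return A @ x - b

def inner_solver(y, kappa, x0, steps=100):
    """Approximately minimize f(x) + (kappa/2)||x - y||^2 by gradient descent.
    This plays the role of an *inexact* proximal operator."""
    L = np.linalg.norm(A, 2) + kappa   # smoothness constant of the subproblem
    x = x0.copy()
    for _ in range(steps):
        g = grad_f(x) + kappa * (x - y)
        x -= g / L
    return x

def catalyst(kappa=1.0, outer_iters=30):
    mu = np.linalg.eigvalsh(A).min()   # strong convexity constant
    q = mu / (mu + kappa)
    alpha = np.sqrt(q)                 # fixed momentum parameter (strongly convex case)
    beta = (1 - alpha) / (1 + alpha)   # extrapolation weight for fixed alpha
    x = np.zeros(n)
    y = x.copy()
    for _ in range(outer_iters):
        x_new = inner_solver(y, kappa, x)      # inexact proximal-point step, warm-started
        y = x_new + beta * (x_new - x)         # Nesterov-style extrapolation
        x = x_new
    return x

x_hat = catalyst()
print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```

The key point the abstract makes is visible here: the outer loop only needs the inner method to make mild progress on each strongly convex subproblem, and the analysis must account for the subproblems being solved inexactly.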




Has companion code repository: https://github.com/KuluAndrej/NIPS-2019-code









This page was built for publication: A Generic Acceleration Framework for Stochastic Composite Optimization
