Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number

Publication: 6306901

arXiv: 1809.06754 · MaRDI QID: Q6306901 · FDO: Q6306901


Authors: Zaiyi Chen, Yi Xu, H. Y. Hu, Tianbao Yang


Publication date: 18 September 2018

Abstract: In this paper, we propose a new SVRG-style accelerated stochastic algorithm for solving a family of non-convex optimization problems whose objective consists of a sum of n smooth functions and a non-smooth convex function. Our major goal is to improve the convergence of SVRG-style stochastic algorithms to stationary points under a setting with a large condition number c, i.e., the ratio between the smoothness constant and the negative curvature constant. The proposed algorithm achieves the best known gradient complexity when c ≥ Ω(n), which was achieved previously by a SAGA-style accelerated stochastic algorithm. Compared with the SAGA-style accelerated stochastic algorithm, the proposed algorithm is more practical due to its low memory cost, inherited from previous SVRG-style algorithms. Compared with previous studies on SVRG-style stochastic algorithms, our theory provides much stronger results in terms of (i) reduced gradient complexity under a large condition number; and (ii) convergence guarantees for a sampled stagewise averaged solution, selected from all stagewise averaged solutions with increasing sampling probabilities, instead of for a uniformly sampled solution across all iterations.
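
The sketch below is a minimal illustration of the setting the abstract describes, not the authors' Katalyst implementation: a composite objective F(x) = (1/n) Σᵢ fᵢ(x) + g(x) with smooth components fᵢ and a non-smooth convex g (here an assumed L1 term with its soft-thresholding proximal operator), optimized in SVRG-style stages whose averaged iterates are then sampled with increasing probabilities. The least-squares components, stage lengths, step size, and sampling weights are placeholder choices, not the paper's tuned constants.

```python
# Illustrative sketch only; all problem data and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 20
A = rng.standard_normal((n, d))   # assumed smooth components f_i(x) = 0.5*(a_i^T x - b_i)^2
b = rng.standard_normal(n)
lam = 0.01                        # assumed weight of the non-smooth convex term g(x) = lam*||x||_1

def grad_fi(x, i):
    """Gradient of the i-th smooth component."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full gradient of the smooth part (1/n) * sum_i f_i(x)."""
    return A.T @ (A @ x - b) / n

def prox_g(x, eta):
    """Proximal operator of eta * lam * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - eta * lam, 0.0)

def svrg_stage(x0, eta, m):
    """One SVRG-style stage: a full-gradient snapshot plus m variance-reduced
    proximal stochastic steps; returns the stagewise averaged iterate."""
    snapshot, mu = x0.copy(), full_grad(x0)
    x, iterates = x0.copy(), []
    for _ in range(m):
        i = rng.integers(n)
        v = grad_fi(x, i) - grad_fi(snapshot, i) + mu   # variance-reduced gradient
        x = prox_g(x - eta * v, eta)
        iterates.append(x.copy())
    return np.mean(iterates, axis=0)

# Run S stages and keep the stagewise averaged solutions; then, as the abstract
# describes, output one of them sampled with *increasing* probabilities
# (later stages more likely) rather than uniformly across all iterations.
S, eta, m = 10, 0.05, 2 * n
x, averages = np.zeros(d), []
for s in range(S):
    x = svrg_stage(x, eta, m)
    averages.append(x)

probs = np.arange(1, S + 1, dtype=float)
probs /= probs.sum()                       # increasing sampling probabilities
output = averages[rng.choice(S, p=probs)]  # sampled stagewise averaged solution
print("sampled stage solution norm:", np.linalg.norm(output))
```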

