Oracle complexity separation in convex optimization

From MaRDI portal
Publication:2139268




Abstract: Many convex optimization problems have a structured objective function, written as a sum of functions with different types of oracles (full gradient, coordinate derivative, stochastic gradient) and different evaluation complexities of these oracles. In the strongly convex case these functions also have different condition numbers, which eventually define the iteration complexity of first-order methods and the number of oracle calls required to achieve a given accuracy. Motivated by the desire to call the more expensive oracle fewer times, in this paper we consider the minimization of a sum of two functions and propose a generic algorithmic framework to separate the oracle complexities for each component in the sum. As a specific example, for the \(\mu\)-strongly convex problem \(\min_{x\in\mathbb{R}^n} h(x)+g(x)\) with an \(L_h\)-smooth function \(h\) and an \(L_g\)-smooth function \(g\), a special case of our algorithm requires, up to a logarithmic factor, \(O(\sqrt{L_h/\mu})\) first-order oracle calls for \(h\) and \(O(\sqrt{L_g/\mu})\) first-order oracle calls for \(g\). Our general framework also covers the setting of non-strongly convex objectives, the setting when \(g\) is given by a coordinate derivative oracle, and the setting when \(g\) has a finite-sum structure and is available through a stochastic gradient oracle. In the latter two cases we obtain, respectively, accelerated random coordinate descent and accelerated variance reduction methods with oracle complexity separation.
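The problem setup behind the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: it is a toy 1-D composite problem (the quadratics, constants, and step size are assumptions chosen for illustration) run with plain gradient descent, where each oracle is counted separately. It shows the baseline the paper improves upon: without separation, the expensive oracle for \(h\) is queried exactly as often as the cheap oracle for \(g\).

```python
# Toy illustration of the setup min_x h(x) + g(x) with separately counted
# oracles (NOT the paper's algorithm). h and g are 1-D quadratics chosen
# for illustration; h is the "expensive" oracle (large smoothness L_h).

L_h, L_g, mu = 100.0, 1.0, 1.0   # smoothness / strong-convexity constants (assumed)

calls = {"h": 0, "g": 0}          # separate oracle-call counters

def grad_h(x):
    calls["h"] += 1
    return L_h * x                # gradient of h(x) = (L_h / 2) * x^2

def grad_g(x):
    calls["g"] += 1
    return L_g * x                # gradient of g(x) = (L_g / 2) * x^2

# Plain gradient descent on the sum: both oracles are queried once per
# iteration, so their call counts are tied together and governed by the
# combined condition number (L_h + L_g) / mu.
x = 1.0
step = 1.0 / (2.0 * (L_h + L_g))  # conservative step size (assumed)
while abs(x) > 1e-6:
    x -= step * (grad_h(x) + grad_g(x))

print(calls)                      # equal counts: no oracle complexity separation
```

The point of the framework described in the abstract is to break this tie: up to a logarithmic factor, the \(h\)-oracle would be called only \(O(\sqrt{L_h/\mu})\) times and the \(g\)-oracle \(O(\sqrt{L_g/\mu})\) times, rather than both at the worst-case rate.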


