Oracle complexity separation in convex optimization
Publication:2139268
Abstract: Many convex optimization problems have a structured objective function written as a sum of functions with different types of oracles (full gradient, coordinate derivative, stochastic gradient) and different evaluation complexities of these oracles. In the strongly convex case these functions also have different condition numbers, which eventually define the iteration complexity of first-order methods and the number of oracle calls required to achieve a given accuracy. Motivated by the desire to call the more expensive oracle fewer times, in this paper we consider minimization of a sum of two functions and propose a generic algorithmic framework to separate the oracle complexities for each component in the sum. As a specific example, for the \(\mu\)-strongly convex problem \(\min_x f(x)+h(x)\) with \(L_f\)-smooth function \(f\) and \(L_h\)-smooth function \(h\), a special case of our algorithm requires, up to a logarithmic factor, \(O(\sqrt{L_f/\mu})\) first-order oracle calls for \(f\) and \(O(\sqrt{L_h/\mu})\) first-order oracle calls for \(h\). Our general framework also covers the setting of non-strongly convex objectives, the setting when \(h\) is given by a coordinate derivative oracle, and the setting when \(h\) has a finite-sum structure and is available through a stochastic gradient oracle. In the latter two cases we obtain, respectively, accelerated random coordinate descent and accelerated variance reduction methods with oracle complexity separation.
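The separation idea described in the abstract can be illustrated with a minimal sliding-style toy (an assumption-laden sketch on quadratics, not the paper's actual accelerated method): each outer iteration makes a single call to the expensive gradient oracle of \(f\), while an inner loop that works only with the cheap gradient oracle of \(h\) approximately solves a proximal subproblem in which \(f\) is linearized. The symbols `hf`, `hh`, `beta`, and the iteration counts are illustrative choices, not values from the paper.

```python
import numpy as np

d = 20
# f: mildly conditioned quadratic (expensive oracle), h: stiff quadratic (cheap oracle)
hf = np.linspace(0.1, 4.0, d)    # eigenvalues of f's (diagonal) Hessian, L_f = 4
hh = np.linspace(0.1, 50.0, d)   # eigenvalues of h's (diagonal) Hessian, L_h = 50
b = np.ones(d)

calls = {"f": 0, "h": 0}
def grad_f(x): calls["f"] += 1; return hf * x - b
def grad_h(x): calls["h"] += 1; return hh * x - b

beta = hf.max()                  # prox parameter >= L_f keeps the outer loop contracting
eta = 1.0 / (hh.max() + beta)    # inner step size 1/(L_h + beta)

x = np.zeros(d)
for _ in range(300):             # outer loop: exactly one grad_f call per iteration
    g = grad_f(x)                # linearize f at the current point
    y = x.copy()
    for _ in range(20):          # inner loop: only grad_h calls, solving the
        # prox subproblem  min_y <g, y> + h(y) + (beta/2)||y - x||^2  approximately
        y = y - eta * (g + grad_h(y) + beta * (y - x))
    x = y

x_star = (b + b) / (hf + hh)     # exact minimizer of f + h
print(calls)                     # {'f': 300, 'h': 6000}
assert np.linalg.norm(x - x_star) < 1e-3
```

The oracle-call counts are deliberately unbalanced, 300 calls to \(\nabla f\) against 6000 to \(\nabla h\), which is the point of complexity separation: the budget of expensive calls is decoupled from the budget of cheap ones. The paper's framework additionally accelerates both loops to reach the \(O(\sqrt{L_f/\mu})\) and \(O(\sqrt{L_h/\mu})\) rates quoted above.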
Recommendations
- A simple method for convex optimization in the oracle model
- Gradient methods for minimizing composite functions
- Fast multiple-splitting algorithms for convex optimization
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- On the oracle complexity of smooth strongly convex minimization
Cites work
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 823069
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Accelerated meta-algorithm for convex optimization problems
- Accelerated, parallel, and proximal coordinate descent
- Adaptive Catalyst for Smooth Convex Optimization
- An accelerated directional derivative method for smooth stochastic convex optimization
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- An optimal randomized incremental gradient method
- Catalyst acceleration for first-order convex optimization: from theory to practice
- Conditional gradient sliding for convex optimization
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Efficiency of the accelerated coordinate descent method on structured optimization problems
- First-order methods for convex optimization
- Gradient methods for minimizing composite functions
- Gradient sliding for composite optimization
- Inexact model: a framework for optimization and variational inequalities
- Katyusha: the first direct acceleration of stochastic gradient methods
- Lectures on convex optimization
- Optimal combination of tensor optimization methods
- Random gradient-free minimization of convex functions
- Smooth minimization of non-smooth functions
- Zeroth-order methods for noisy Hölder-gradient functions
Cited in (8)
- On the oracle complexity of smooth strongly convex minimization
- Minimizing oracle-structured composite functions
- A Gradient Complexity Analysis for Minimizing the Sum of Strongly Convex Functions with Varying Condition Numbers
- Accelerated variance-reduced methods for saddle-point problems
- Optimal combination of tensor optimization methods
- Unifying framework for accelerated randomized methods in convex optimization
- A simple method for convex optimization in the oracle model
- Equivalence of Convex Problem Geometry and Computational Complexity in the Separation Oracle Model