Oracle complexity separation in convex optimization
MaRDI QID: Q2139268
DOI: 10.1007/S10957-022-02038-7
zbMATH Open: 1492.90125
arXiv: 2002.02706
OpenAlex: W3005409728
FDO: Q2139268
Pavel Dvurechensky, Anastasiya Ivanova, Dmitry Pasechnyuk, Alexander Tyurin, Alexander V. Gasnikov, Darina Dvinskikh, Evgeniya A. Vorontsova
Publication date: 17 May 2022
Published in: Journal of Optimization Theory and Applications
Abstract: Many convex optimization problems have a structured objective function written as a sum of functions with different types of oracles (full gradient, coordinate derivative, stochastic gradient) and different evaluation complexities of these oracles. In the strongly convex case these functions also have different condition numbers, which eventually define the iteration complexity of first-order methods and the number of oracle calls required to achieve a given accuracy. Motivated by the desire to call the more expensive oracle fewer times, in this paper we consider minimization of a sum of two functions and propose a generic algorithmic framework to separate the oracle complexities for each component in the sum. As a specific example, for the μ-strongly convex problem min_x f(x) + g(x) with L_f-smooth function f and L_g-smooth function g, a special case of our algorithm requires, up to a logarithmic factor, O(√(L_f/μ)) first-order oracle calls for f and O(√(L_g/μ)) first-order oracle calls for g. Our general framework also covers the setting of (strongly) convex objectives, the setting when one of the functions is given by a coordinate derivative oracle, and the setting when one of the functions has a finite-sum structure and is available through a stochastic gradient oracle. In the latter two cases we obtain, respectively, accelerated random coordinate descent and accelerated variance reduction methods with oracle complexity separation.
Full work available at URL: https://arxiv.org/abs/2002.02706
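The abstract's central idea, calling the cheap ∇f oracle roughly √(L_f/μ) times and the expensive ∇g oracle roughly √(L_g/μ) times instead of paying the worse of the two rates for both, can be illustrated with a much simpler, non-accelerated sliding-style loop. The sketch below is not the authors' framework: it evaluates ∇g once per outer iteration and runs an inner gradient loop that touches only ∇f on the subproblem f(y) + ⟨∇g(x_k), y⟩ + (L_g/2)‖y − x_k‖². All test data, step sizes, and iteration counts are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's accelerated algorithm) of oracle separation:
# minimize h(x) = f(x) + g(x) where grad g is "expensive" and grad f is "cheap".
# Each outer step calls grad g once; the inner loop uses only grad f.
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n)); A = A @ A.T / n + 0.1 * np.eye(n)  # f(x) = 0.5 x^T A x
B = rng.standard_normal((n, n)); B = B @ B.T / n + 0.1 * np.eye(n)  # g(x) = 0.5 x^T B x + b^T x
b = rng.standard_normal(n)

L_f = np.linalg.eigvalsh(A)[-1]   # smoothness constant of f
L_g = np.linalg.eigvalsh(B)[-1]   # smoothness constant of g

calls = {"f": 0, "g": 0}
def grad_f(x):            # cheap oracle
    calls["f"] += 1
    return A @ x
def grad_g(x):            # expensive oracle: call as rarely as possible
    calls["g"] += 1
    return B @ x + b

def h(x):
    return 0.5 * x @ A @ x + 0.5 * x @ B @ x + b @ x

x = np.zeros(n)
for k in range(200):                      # outer loop: one grad_g call per iteration
    gg = grad_g(x)
    y = x.copy()
    for _ in range(20):                   # inner loop: grad_f calls only
        # gradient of the subproblem f(y) + <gg, y> + (L_g/2)||y - x||^2
        grad_sub = grad_f(y) + gg + L_g * (y - x)
        y = y - grad_sub / (L_f + L_g)    # step size 1/(L_f + L_g)
    x = y

print("h(x) =", h(x), "| f-oracle calls:", calls["f"], "| g-oracle calls:", calls["g"])
```

Running this, the g-oracle counter equals the number of outer iterations while the f-oracle counter is twenty times larger; the paper makes this separation precise by accelerating both loops so that each oracle is called only about the square root of its own condition number times.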
Cites Work
- Smooth minimization of non-smooth functions
- Gradient methods for minimizing composite functions
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Random gradient-free minimization of convex functions
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- Gradient sliding for composite optimization
- First-order methods for convex optimization
- Lectures on convex optimization
- Conditional gradient sliding for convex optimization
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- An optimal randomized incremental gradient method
- An accelerated directional derivative method for smooth stochastic convex optimization
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated meta-algorithm for convex optimization problems
- Zeroth-order methods for noisy Hölder-gradient functions
- Optimal combination of tensor optimization methods
- Inexact model: a framework for optimization and variational inequalities
- Adaptive Catalyst for Smooth Convex Optimization
Cited In (6)
- On the oracle complexity of smooth strongly convex minimization
- Accelerated variance-reduced methods for saddle-point problems
- Optimal combination of tensor optimization methods
- Unifying framework for accelerated randomized methods in convex optimization
- A simple method for convex optimization in the oracle model
- Equivalence of Convex Problem Geometry and Computational Complexity in the Separation Oracle Model