Universal Conditional Gradient Sliding for Convex Optimization
Publication: 6071883
DOI: 10.1137/21M1406234 · zbMATH Open: 1529.90059 · arXiv: 2103.11026 · MaRDI QID: Q6071883 · FDO: Q6071883
Authors: Yuyuan Ouyang, Trevor Squires
Publication date: 29 November 2023
Published in: SIAM Journal on Optimization
Abstract: In this paper, we present a first-order projection-free method, namely, the universal conditional gradient sliding (UCGS) method, for computing ε-approximate solutions to convex differentiable optimization problems. For objective functions with Hölder continuous gradients, we show that UCGS terminates with an ε-solution after at most O((1/ε)^{2/(1+3ν)}) gradient evaluations and O((1/ε)^{4/(1+3ν)}) linear objective optimizations, where ν ∈ (0,1] and M_ν > 0 are the exponent and constant of the Hölder condition. Furthermore, UCGS performs these computations without requiring any specific knowledge of the smoothness parameters ν and M_ν. In the weakly smooth case, when ν ∈ (0,1), both complexity results improve on the current state-of-the-art results for first-order projection-free methods achieved by the conditional gradient method. Within the class of sliding-type algorithms, to the best of our knowledge, this is the first time a sliding-type algorithm improves not only the gradient complexity but also the overall complexity for computing an approximate solution. In the smooth case, when ν = 1, UCGS matches the state-of-the-art complexity result but adds features that allow for practical implementation.
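To make the "projection-free" idea concrete, the sketch below implements the classical conditional gradient (Frank-Wolfe) step that UCGS builds on: each iteration uses one gradient evaluation and one linear objective optimization over the feasible set, never a projection. This is an illustrative sketch, not the UCGS method itself (UCGS additionally wraps such steps in an accelerated sliding scheme with a universal, parameter-free step-size search); the test problem and function names are assumptions for the example.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=3000):
    """Minimize a smooth convex f over the probability simplex using only
    gradient evaluations and linear minimization oracle (LMO) calls."""
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        # LMO over the simplex: min_s <g, s> is attained at a vertex,
        # i.e., the coordinate basis vector with the smallest gradient entry.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (k + 2)            # standard open-loop step size
        x = (1 - gamma) * x + gamma * s  # convex combination: stays feasible
    return x

# Illustrative problem: least squares over the simplex, f(x) = 0.5*||Ax - b||^2,
# with b chosen so that a feasible point attains f = 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([0.3, 0.2, 0.1, 0.25, 0.15])
grad = lambda x: A.T @ (A @ x - b)
x = frank_wolfe_simplex(grad, np.ones(5) / 5)
print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Note that no projection onto the simplex is ever computed; feasibility is maintained purely through convex combinations of LMO outputs, which is what makes conditional gradient methods attractive on sets where projection is expensive but linear optimization is cheap.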
Full work available at URL: https://arxiv.org/abs/2103.11026
Keywords: convex optimization; conditional gradient method; first-order method; conditional gradient sliding; universal gradient method
Cites Work
- Introductory lectures on convex optimization. A basic course.
- First-order methods in optimization
- First-order methods of smooth convex optimization with inexact oracle
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Universal gradient methods for convex optimization problems
- Gradient sliding for composite optimization
- New analysis and results for the Frank-Wolfe method
- Convex optimization: algorithms and complexity
- Lectures on convex optimization
- Conditional gradient sliding for convex optimization
- On the convergence properties of non-Euclidean extragradient methods for variational inequalities with generalized monotone operators
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Fast bundle-level methods for unconstrained and ball-constrained convex optimization
- First-order and stochastic optimization methods for machine learning