Universal Conditional Gradient Sliding for Convex Optimization

From MaRDI portal
Publication:6071883

DOI: 10.1137/21M1406234
zbMATH Open: 1529.90059
arXiv: 2103.11026
MaRDI QID: Q6071883


Authors: Yuyuan Ouyang, Trevor Squires


Publication date: 29 November 2023

Published in: SIAM Journal on Optimization

Abstract: In this paper, we present a first-order projection-free method, namely, the universal conditional gradient sliding (UCGS) method, for computing $\varepsilon$-approximate solutions to convex differentiable optimization problems. For objective functions with Hölder continuous gradients, we show that UCGS is able to terminate with $\varepsilon$-solutions using at most $\mathcal{O}((M_\nu D_X^{1+\nu}/\varepsilon)^{2/(1+3\nu)})$ gradient evaluations and $\mathcal{O}((M_\nu D_X^{1+\nu}/\varepsilon)^{4/(1+3\nu)})$ linear objective optimizations, where $\nu\in(0,1]$ and $M_\nu>0$ are the exponent and constant of the Hölder condition. Furthermore, UCGS performs these computations without requiring any specific knowledge of the smoothness information $\nu$ and $M_\nu$. In the weakly smooth case, when $\nu\in(0,1)$, both complexity results improve on the current state-of-the-art $\mathcal{O}((M_\nu D_X^{1+\nu}/\varepsilon)^{1/\nu})$ results for first-order projection-free methods achieved by the conditional gradient method. Within the class of sliding-type algorithms, to the best of our knowledge, this is the first time a sliding-type algorithm is able to improve not only the gradient complexity but also the overall complexity for computing an approximate solution. In the smooth case, when $\nu=1$, UCGS matches the state-of-the-art complexity result but adds more features allowing for practical implementation.
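For context, the abstract's "projection-free" setting means each iteration calls a linear minimization oracle (LMO) over the feasible set instead of a projection. Below is a minimal sketch of the classical conditional gradient (Frank-Wolfe) template that UCGS refines with sliding and universal step sizes; it is not the UCGS algorithm itself, and the example problem, step-size rule, and function names are illustrative assumptions.

```python
import numpy as np

def conditional_gradient(grad, lmo, x0, iters=200):
    """Basic Frank-Wolfe template: each iteration uses one gradient
    evaluation and one linear minimization oracle (LMO) call over the
    feasible set X, and never a projection."""
    x = x0
    for k in range(iters):
        g = grad(x)
        s = lmo(g)                   # s = argmin_{s in X} <g, s>
        gamma = 2.0 / (k + 2.0)      # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Illustrative problem (an assumption, not from the paper):
# minimize ||x - c||^2 over the probability simplex.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2.0 * (x - c)

def simplex_lmo(g):
    # The LMO over the simplex returns its best vertex:
    # the coordinate with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x = conditional_gradient(grad, simplex_lmo, np.array([1.0, 0.0, 0.0]))
```

Since every iterate is a convex combination of simplex vertices, `x` stays feasible without any projection; the UCGS method of the paper keeps this oracle structure while improving the gradient and LMO complexities stated in the abstract.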


Full work available at URL: https://arxiv.org/abs/2103.11026





This page was built for publication: Universal Conditional Gradient Sliding for Convex Optimization
