Linearly convergent away-step conditional gradient for non-strongly convex functions


DOI: 10.1007/s10107-016-1069-4
zbMATH Open: 1370.90010
arXiv: 1504.05002
OpenAlex: W2134882453
MaRDI QID: Q2364483

Amir Beck, Shimrit Shtern

Publication date: 21 July 2017

Published in: Mathematical Programming. Series A. Series B

Abstract: We consider the problem of minimizing a function that is the sum of a linear function and a composition of a strongly convex function with a linear transformation, over a compact polyhedral set. Jaggi and Lacoste-Julien [14] showed that the conditional gradient method with away steps, employed on the aforementioned problem without the additional linear term, has a linear rate of convergence depending on the so-called pyramidal width of the feasible set. We revisit this result and provide a variant of the algorithm and an analysis based on simple duality arguments, together with corresponding error bounds. This new analysis (a) enables the incorporation of the additional linear term, (b) does not require a linear oracle that outputs an extreme point of the linear mapping of the feasible set, and (c) depends on a new constant, termed the "vertex-facet distance constant", which is explicitly expressed in terms of the problem's parameters and the geometry of the feasible set. This constant replaces the pyramidal width, which is difficult to evaluate.
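To make the problem class concrete, the objective has the form min_{x in P} <b, x> + g(Ex), with P a compact polyhedral set, g strongly convex, and E a linear map. The following is a minimal Python sketch of an away-step conditional gradient scheme for one illustrative instance of this class: P the unit simplex (whose vertices are the standard basis vectors) and g(y) = ||y - d||^2 / 2. The names E, d, b, the starting vertex, and the exact line search are assumptions made for the sketch; this is not the authors' exact variant, which per the abstract differs in the oracle it requires and in its analysis.

    import numpy as np

    def away_step_cg(E, d, b, max_iter=1000, tol=1e-8):
        """Away-step conditional gradient (Frank-Wolfe) sketch for
            min_{x in unit simplex}  b^T x + 0.5 * ||E x - d||^2,
        i.e. a linear term plus a strongly convex quadratic composed with
        the linear map E. Illustrative instance only, not the paper's
        exact algorithm. Shapes: E is (m, n), d is (m,), b is (n,)."""
        n = E.shape[1]
        # alpha holds the convex-combination weights of x over the
        # simplex vertices e_i; start at the (arbitrary) first vertex.
        alpha = np.zeros(n)
        alpha[0] = 1.0
        x = alpha.copy()
        for _ in range(max_iter):
            grad = b + E.T @ (E @ x - d)
            # Frank-Wolfe vertex: minimizes the linear model over the
            # simplex, i.e. the coordinate with the smallest gradient.
            i_fw = int(np.argmin(grad))
            d_fw = -x.copy()
            d_fw[i_fw] += 1.0            # d_fw = e_{i_fw} - x
            gap_fw = -grad @ d_fw        # Frank-Wolfe (duality) gap
            if gap_fw <= tol:
                break
            # Away vertex: active vertex with the largest gradient value.
            active = np.flatnonzero(alpha > 0)
            i_aw = active[int(np.argmax(grad[active]))]
            d_aw = x.copy()
            d_aw[i_aw] -= 1.0            # d_aw = x - e_{i_aw}
            # Take whichever direction has the larger descent potential.
            if gap_fw >= -grad @ d_aw:
                direction, gamma_max, fw_step = d_fw, 1.0, True
            else:
                a = alpha[i_aw]
                direction, gamma_max, fw_step = d_aw, a / (1.0 - a), False
            # Exact line search, available here because g is quadratic:
            # the optimum of gamma -> f(x + gamma * direction) is
            # -grad.direction / ||E direction||^2, clipped to [0, gamma_max].
            Ed = E @ direction
            denom = Ed @ Ed
            if denom <= 0.0:
                gamma = gamma_max
            else:
                gamma = min(gamma_max, max(0.0, -(grad @ direction) / denom))
            x = x + gamma * direction
            # Keep the vertex weights consistent with the new iterate.
            if fw_step:
                alpha *= (1.0 - gamma)
                alpha[i_fw] += gamma
            else:
                alpha *= (1.0 + gamma)
                alpha[i_aw] -= gamma
            alpha[alpha < 1e-15] = 0.0   # drop numerically dead vertices
        return x

When E has full column rank the composite objective is strongly convex in x, but the point of the paper is precisely that strong convexity of the composite is not needed: linear convergence holds for the composition-plus-linear-term structure itself, with the rate governed by the vertex-facet distance constant rather than the pyramidal width.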


Full work available at URL: https://arxiv.org/abs/1504.05002








