Performance of first-order methods for smooth convex minimization: a novel approach


DOI: 10.1007/s10107-013-0653-0
zbMATH Open: 1300.90068
arXiv: 1206.3209
OpenAlex: W1979896658
MaRDI QID: Q2248759


Authors: Yoel Drori, Marc Teboulle


Publication date: 27 June 2014

Published in: Mathematical Programming. Series A. Series B

Abstract: We introduce a novel approach for analyzing the performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space R^d. Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is itself an optimization problem, which we call the Performance Estimation Problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which, among others, includes the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class, it is possible to derive new numerical bounds on the performance of these methods by solving an adequately relaxed convex semidefinite PEP. Finally, we show an efficient procedure for finding optimal step sizes, which results in a first-order black-box method that achieves the best performance.
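
As an illustration of what the abstract describes, the following is a schematic sketch of the Performance Estimation Problem for the fixed-step gradient method; the notation (step sizes h_i, initial radius R) is introduced here for illustration and follows common PEP conventions rather than being quoted from the paper:

\[
\begin{aligned}
\max_{f,\; x_0,\dots,x_N,\; x_*} \quad & f(x_N) - f(x_*) \\
\text{s.t.} \quad & f \text{ is convex and } L\text{-smooth on } \mathbb{R}^d, \\
& x_* \in \operatorname*{arg\,min}_x f(x), \qquad \|x_0 - x_*\| \le R, \\
& x_{i+1} = x_i - \tfrac{h_i}{L}\, \nabla f(x_i), \qquad i = 0,\dots,N-1 .
\end{aligned}
\]

The optimal value of this problem is, by construction, the exact worst-case accuracy of the method after N iterations over the given class of functions; as stated in the abstract, the paper bounds it by relaxing the PEP to a tractable convex semidefinite program.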


Full work available at URL: https://arxiv.org/abs/1206.3209






Cited In (82)






