An optimal first order method based on optimal quadratic averaging

Publication:4603040

DOI: 10.1137/16M1072528
zbMATH Open: 1382.65169
arXiv: 1604.06543
OpenAlex: W2964093568
MaRDI QID: Q4603040
FDO: Q4603040


Authors: Maryam Fazel, Scott Roy, D. Drusvyatskiy


Publication date: 14 February 2018

Published in: SIAM Journal on Optimization

Abstract: In a recent paper, Bubeck, Lee, and Singh introduced a new first order method for minimizing smooth strongly convex functions. Their geometric descent algorithm, largely inspired by the ellipsoid method, enjoys the optimal linear rate of convergence. We show that the same iterate sequence is generated by a scheme that in each iteration computes an optimal average of quadratic lower-models of the function. Indeed, the minimum of the averaged quadratic approaches the true minimum at an optimal rate. This intuitive viewpoint reveals clear connections to the original fast-gradient methods and cutting plane ideas, and leads to limited-memory extensions with improved performance.


Full work available at URL: https://arxiv.org/abs/1604.06543
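The averaging idea described in the abstract — combining quadratic lower models of a strongly convex function so that the minimum of the averaged quadratic certifies progress — can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the canonical lower-model form and the optimal-averaging formula follow from the standard strong-convexity bound, while the function names, the plain gradient-descent driver, and the toy problem are assumptions made for the example.

```python
import numpy as np

def lower_model(f_x, grad, x, mu):
    """Quadratic lower bound Q(y) = v + (mu/2)||y - c||^2 from one gradient.

    mu-strong convexity gives f(y) >= f(x) + <grad, y-x> + (mu/2)||y-x||^2;
    completing the square yields the center c and minimum value v below.
    """
    c = x - grad / mu
    v = f_x - grad @ grad / (2.0 * mu)
    return c, v

def optimal_average(cA, vA, cB, vB, mu):
    """Convex combination lam*Q_A + (1-lam)*Q_B whose minimum value is largest.

    The averaged quadratic has center lam*cA + (1-lam)*cB and minimum value
    lam*vA + (1-lam)*vB + (mu/2)*lam*(1-lam)*||cA - cB||^2, a concave
    quadratic in lam; its maximizer is computed in closed form and clipped.
    """
    d2 = np.sum((cA - cB) ** 2)
    if d2 == 0.0:
        return cA, max(vA, vB)
    lam = np.clip(0.5 + (vA - vB) / (mu * d2), 0.0, 1.0)
    c = lam * cA + (1.0 - lam) * cB
    v = lam * vA + (1.0 - lam) * vB + 0.5 * mu * lam * (1.0 - lam) * d2
    return c, v

# Toy problem (an assumption for illustration): f(x) = 0.5 * x' D x,
# strongly convex with mu = 1, smooth with L = 10; true minimum value is 0.
diag = np.array([1.0, 10.0])
mu, L = diag.min(), diag.max()
f = lambda x: 0.5 * x @ (diag * x)
grad_f = lambda x: diag * x

x = np.array([3.0, -2.0])
c, v = lower_model(f(x), grad_f(x), x, mu)         # initial lower bound
for _ in range(100):
    x = x - grad_f(x) / L                          # gradient step for the iterate
    c_new, v_new = lower_model(f(x), grad_f(x), x, mu)
    c, v = optimal_average(c, v, c_new, v_new, mu)

# v is a certified lower bound on min f; the gap f(x) - v squeezes to zero.
print(f"upper bound {f(x):.2e}, certified lower bound {v:.2e}")
```

Because each model is a global lower bound on f, so is any convex combination of them; hence v never exceeds the true minimum, and since the optimal lam is at least as good as lam = 0 or lam = 1, the certified value v can only increase across iterations.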





