Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation

From MaRDI portal (Publication: 5116548)

DOI: 10.1137/19M1281368
zbMATH Open: 1448.90070
arXiv: 1709.05191
MaRDI QID: Q5116548
FDO: Q5116548


Authors: E. de Klerk, François Glineur, Adrien B. Taylor


Publication date: 18 August 2020

Published in: SIAM Journal on Optimization

Abstract: We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and Newton's method for self-concordant functions, including the case of inexact search directions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451-482, 2014], and extends recent performance estimation results for the method of Cauchy by the authors [Optimization Letters, 11(7):1185-1199, 2017]. To illustrate the applicability of the tools, we demonstrate a novel complexity analysis of short-step interior point methods using inexact search directions. As an example in this framework, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [PMLR, 48:2520-2528, 2016].


Full work available at URL: https://arxiv.org/abs/1709.05191
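
To give a concrete sense of the semidefinite programming performance estimation approach described in the abstract, the sketch below sets up a small performance estimation problem (PEP) as an SDP in CVXPY: it computes the worst-case value of f(x1) - f(x*) after a single exact gradient step on an L-smooth, mu-strongly convex function, using the interpolation conditions of Taylor, Hendrickx, and Glineur. This is an illustrative reconstruction, not the authors' code; the parameter values (L = 1, mu = 0.1, step size gamma = 1/L), the one-step setting, and all variable names are assumptions chosen for brevity. Representing iterates and gradients through a Gram matrix is what makes every constraint linear in the decision variables, so the whole problem is a tractable SDP.

```python
# Minimal PEP sketch (assumed setup, not the authors' formulation):
# worst case of f(x1) - f(x*) after one gradient step x1 = x0 - gamma*g0
# over all L-smooth, mu-strongly convex f with ||x0 - x*||^2 <= 1.
import numpy as np
import cvxpy as cp

L_, mu = 1.0, 0.1        # illustrative smoothness / strong convexity parameters
gamma = 1.0 / L_         # illustrative step size

# Gram matrix of the vectors (x0 - x*, g0, g1) and the function values.
G = cp.Variable((3, 3), PSD=True)
f = cp.Variable(2)       # f[0] = f(x0) - f*, f[1] = f(x1) - f*

def dot(a, b):
    """Inner product <a, b>, expressed linearly in the Gram matrix G."""
    M = 0.5 * (np.outer(a, b) + np.outer(b, a))
    return cp.trace(G @ M)

# Coefficients of x_i - x* and g_i in the basis (x0 - x*, g0, g1).
pts = {
    "star": (np.zeros(3),                np.zeros(3),           0.0),
    "x0":   (np.array([1.0, 0, 0]),      np.array([0, 1.0, 0]), f[0]),
    "x1":   (np.array([1.0, -gamma, 0]), np.array([0, 0, 1.0]), f[1]),
}

# Interpolation conditions for L-smooth, mu-strongly convex functions
# (Taylor, Hendrickx, Glineur), imposed for every ordered pair of points.
cons = []
for i in pts:
    for j in pts:
        if i == j:
            continue
        xi, gi, fi = pts[i]
        xj, gj, fj = pts[j]
        q = (dot(gi - gj, gi - gj) / L_
             + mu * dot(xi - xj, xi - xj)
             - 2.0 * (mu / L_) * dot(gj - gi, xj - xi))
        cons.append(fi >= fj + dot(gj, xi - xj) + q / (2.0 * (1.0 - mu / L_)))

cons.append(dot(pts["x0"][0], pts["x0"][0]) <= 1.0)   # ||x0 - x*||^2 <= 1

prob = cp.Problem(cp.Maximize(f[1]), cons)
prob.solve()
print("worst-case f(x1) - f* :", prob.value)
```

Extending this sketch to several iterations, to inexact search directions, or to Newton steps on self-concordant functions changes the points, the constraints, and the interpolation conditions, but the overall pattern (a Gram-matrix SDP maximizing a worst-case performance measure) stays the same.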




Cited in: 17 documents
