Iteration Complexity of Fixed-Step Methods by Nesterov and Polyak for Convex Quadratic Functions

Publication: 6417697

arXiv: 2211.10234 · MaRDI QID: Q6417697 · FDO: Q6417697

Florian Jarre, Melinda Hagedorn

Publication date: 18 November 2022

Abstract: This note considers the momentum method by Polyak and the accelerated gradient method by Nesterov, both without line search but with fixed step length, applied to strictly convex quadratic functions, assuming that exact gradients are used and that appropriate upper and lower bounds for the extreme eigenvalues of the Hessian matrix are known. Simple two-dimensional examples show that the Euclidean distance of the iterates to the optimal solution is non-monotone. In this context, an explicit bound is derived on the number of iterations needed to guarantee a reduction of the Euclidean distance to the optimal solution by a factor ε. For both methods the bound is optimal up to a constant factor; it complements earlier asymptotically optimal results for the momentum method, and it establishes another link between the momentum method and Nesterov's accelerated gradient method.
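As a rough illustration of the setting described in the abstract (not taken from the paper itself), the sketch below applies Polyak's momentum method and Nesterov's accelerated gradient method with fixed step lengths to a strictly convex two-dimensional quadratic. The parameter choices are the classical ones derived from eigenvalue bounds μ and L; the variable names and the specific test problem are assumptions for illustration only, and the script merely checks that the Euclidean distance to the optimum is non-monotone.

```python
# Minimal sketch (assumed setup, not the authors' code): fixed-step heavy-ball (Polyak)
# and Nesterov's accelerated gradient on a strictly convex 2-d quadratic
# f(x) = 0.5 * x^T A x, with exact gradients and known eigenvalue bounds mu, L.
import numpy as np

A = np.diag([1.0, 100.0])          # Hessian of the strictly convex 2-d quadratic
mu, L = 1.0, 100.0                 # lower/upper bounds on the extreme eigenvalues
grad = lambda x: A @ x             # exact gradient; the minimizer is x* = 0

def heavy_ball(x0, iters):
    """Polyak momentum with the classical fixed step and momentum parameters."""
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
    beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
    x_prev, x = x0.copy(), x0.copy()
    dists = [np.linalg.norm(x)]
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
        dists.append(np.linalg.norm(x))
    return np.array(dists)

def nesterov(x0, iters):
    """Nesterov's accelerated gradient with constant momentum (strongly convex case)."""
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x_prev, x = x0.copy(), x0.copy()
    dists = [np.linalg.norm(x)]
    for _ in range(iters):
        y = x + beta * (x - x_prev)       # extrapolation step
        x_next = y - grad(y) / L          # gradient step with fixed length 1/L
        x_prev, x = x, x_next
        dists.append(np.linalg.norm(x))
    return np.array(dists)

x0 = np.array([1.0, 1.0])
d_hb, d_nes = heavy_ball(x0, 60), nesterov(x0, 60)
# Any increase in the distance sequence demonstrates non-monotonicity.
print("heavy-ball distance non-monotone:", bool(np.any(np.diff(d_hb) > 0)))
print("Nesterov distance non-monotone:  ", bool(np.any(np.diff(d_nes) > 0)))
```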
