Polyak Steps for Adaptive Fast Gradient Method
Publication: 6320140
arXiv: 1906.03056
MaRDI QID: Q6320140
FDO: Q6320140
Alexandre d'Aspremont, Mathieu Barré
Publication date: 7 June 2019
Abstract: Accelerated algorithms for minimizing smooth strongly convex functions usually require knowledge of the strong convexity parameter $\mu$. In the case of an unknown $\mu$, current adaptive techniques are based on restart schemes. When the optimal value $f^*$ is known, these strategies recover the accelerated linear convergence bound without additional grid search. In this paper we propose a new approach that attains the same bound without any restart, using an online estimation of the strong convexity parameter. We show the robustness of the Fast Gradient Method when run with a sequence of upper bounds on $\mu$. We also present a good candidate for this estimate sequence and detail consistent empirical results.
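To illustrate the idea described in the abstract, the following is a minimal sketch (not the paper's exact algorithm): a standard strongly convex Fast Gradient Method in which $\mu$ is replaced at each iteration by an online upper bound. The estimate used here, $\|\nabla f(y_k)\|^2 / (2(f(y_k) - f^*))$, is the natural Polyak-type upper bound on $\mu$ implied by the Polyak-Lojasiewicz inequality when $f^*$ is known; the function and parameter names (fgm_polyak_mu, f, grad, L, f_star) are illustrative assumptions.

```python
import numpy as np

def fgm_polyak_mu(f, grad, x0, L, f_star, n_iters=500, tol=1e-10):
    """Fast Gradient Method with an online upper-bound estimate of mu.

    Sketch only: assumes f is L-smooth and mu-strongly convex, and that the
    optimal value f_star is known (as in Polyak step sizes).
    """
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    mu_est = L  # trivial initial upper bound, since mu <= L
    for _ in range(n_iters):
        g = grad(y)
        gap = f(y) - f_star
        if gap <= tol:
            break
        # Polyak-type upper bound on mu (PL inequality); keep the sequence nonincreasing.
        mu_est = min(mu_est, np.dot(g, g) / (2.0 * gap))
        # Gradient step with step size 1/L.
        x = y - g / L
        # Momentum coefficient of the strongly convex FGM, using mu_est in place of mu.
        q = np.sqrt(mu_est / L)
        beta = (1.0 - q) / (1.0 + q)
        y = x + beta * (x - x_prev)
        x_prev = x
    return x_prev

# Example usage on a simple quadratic, where f_star = 0 is known.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    x_hat = fgm_polyak_mu(f, grad, x0=np.ones(3), L=100.0, f_star=0.0)
    print(x_hat, f(x_hat))
```

In this sketch the estimate sequence is forced to be nonincreasing by taking a running minimum, so the method always runs with some upper bound on $\mu$, matching the robustness setting discussed in the abstract.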