Towards explicit superlinear convergence rate for SR1

From MaRDI portal
Publication:6038671

DOI: 10.1007/S10107-022-01865-W · zbMATH Open: 1518.90126 · arXiv: 2105.07162 · OpenAlex: W4290068894 · Wikidata: Q114228476 · Scholia: Q114228476 · MaRDI QID: Q6038671 · FDO: Q6038671


Authors: Haishan Ye, Dachao Lin, Xiangyu Chang, Zhihua Zhang


Publication date: 2 May 2023

Published in: Mathematical Programming. Series A. Series B

Abstract: We study the convergence rate of the famous Symmetric Rank-1 (SR1) algorithm, which has wide applications in different scenarios. Although it has been extensively investigated, SR1 still lacks a non-asymptotic superlinear rate, in contrast to other quasi-Newton methods such as DFP and BFGS. In this paper we address this problem. Inspired by recent work on explicit convergence analysis of quasi-Newton methods, we obtain the first explicit non-asymptotic rates of superlinear convergence for the vanilla SR1 method with a correction strategy that ensures numerical stability. Specifically, the vanilla SR1 with the correction strategy achieves rates of the form $\left(\frac{4n\ln(e\kappa)}{k}\right)^{k/2}$ for general smooth strongly convex functions, where $k$ is the iteration counter, $\kappa$ is the condition number of the objective function, and $n$ is the dimension of the problem. For quadratic functions, the vanilla SR1 algorithm finds the optimum of the objective in at most $n$ steps.
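The abstract's claim about quadratics can be illustrated with a minimal sketch of the classical SR1 update. The code below is not the paper's corrected algorithm; it is a standard SR1 quasi-Newton iteration with exact line search on a quadratic $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$, using the textbook skip rule (a simple stand-in for the paper's correction strategy) to guard the rank-one denominator. All names (`sr1_quadratic`, the tolerance `r`) are illustrative choices, not from the paper.

```python
import numpy as np

def sr1_quadratic(A, b, x0, tol=1e-10, r=1e-8):
    """SR1 quasi-Newton with exact line search on f(x) = 0.5 x^T A x - b^T x.

    H approximates the inverse Hessian A^{-1}; the update is skipped when
    the rank-one denominator is numerically unsafe (the textbook rule,
    standing in for the paper's correction strategy).
    """
    x = x0.astype(float)
    n = len(b)
    H = np.eye(n)                       # initial inverse-Hessian estimate
    g = A @ x - b                       # gradient of the quadratic
    steps = 0
    while np.linalg.norm(g) > tol and steps < 2 * n:
        d = -H @ g                      # quasi-Newton direction
        alpha = -(g @ d) / (d @ A @ d)  # exact line search for a quadratic
        s = alpha * d
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        v = s - H @ y                   # SR1 residual vector
        denom = v @ y
        # skip rule: only update H when |v^T y| is safely bounded away from 0
        if abs(denom) > r * np.linalg.norm(v) * np.linalg.norm(y):
            H += np.outer(v, v) / denom
        g = g_new
        steps += 1
    return x, steps

# Demo on a random well-conditioned SPD quadratic in dimension 5.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)          # true optimum A^{-1} b
x_hat, steps = sr1_quadratic(A, b, np.zeros(n))
```

On this quadratic the iterate `x_hat` matches `x_star` after a handful of iterations, consistent with the finite-termination property stated in the abstract.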


Full work available at URL: https://arxiv.org/abs/2105.07162





Cited In (3)





This page was built for publication: Towards explicit superlinear convergence rate for SR1
