On the asymptotic convergence and acceleration of gradient methods


DOI: 10.1007/s10915-021-01685-8
zbMATH Open: 1481.90309
arXiv: 1908.07111
OpenAlex: W3217755387
MaRDI QID: Q2053340


Authors: Yakui Huang, Yuhong Dai, Hongchao Zhang, Xinwei Liu


Publication date: 29 November 2021

Published in: Journal of Scientific Computing

Abstract: We consider the asymptotic behavior of a family of gradient methods, which includes the steepest descent and minimal gradient methods as special instances. It is proved that each method in the family asymptotically zigzags between two directions. Asymptotic convergence results for the objective value, the gradient norm, and the stepsize are presented as well. To accelerate the family of gradient methods, we further exploit spectral properties of the stepsizes to break the zigzagging pattern. In particular, a new stepsize is derived by imposing finite termination on the minimization of a two-dimensional strictly convex quadratic function. It is shown that, for a general quadratic function, the proposed stepsize asymptotically converges to the reciprocal of the largest eigenvalue of the Hessian. Furthermore, based on this spectral property, we propose a periodic gradient method that incorporates the Barzilai-Borwein method. Numerical comparisons with some recent successful gradient methods show that our new method is very promising.
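
To make the classical ingredients mentioned in the abstract concrete, the sketch below runs a gradient method on a convex quadratic f(x) = 0.5 xᵀAx − bᵀx with three textbook stepsize rules: the exact steepest-descent step gᵀg / gᵀAg, the minimal-gradient step gᵀAg / (Ag)ᵀ(Ag), and the long Barzilai-Borwein step sᵀs / sᵀy. This is only a minimal illustration of the standard methods under assumed test data; it does not implement the paper's new stepsize or its periodic method, and the function name, test problem, and parameters are illustrative.

```python
import numpy as np

def quad_grad_descent(A, b, x0, stepsize="sd", max_iter=300, tol=1e-10):
    """Gradient method for f(x) = 0.5*x'Ax - b'x with A symmetric positive definite.

    stepsize: "sd" (exact steepest descent), "mg" (minimal gradient),
              or "bb1" (long Barzilai-Borwein step).
    Returns the final iterate and the history of gradient norms.
    """
    x = x0.astype(float).copy()
    g = A @ x - b                          # gradient of the quadratic
    x_prev, g_prev = None, None
    history = []
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        history.append(gnorm)
        if gnorm < tol:
            break
        Ag = A @ g
        if stepsize == "sd":               # exact line search: g'g / g'Ag
            alpha = (g @ g) / (g @ Ag)
        elif stepsize == "mg":             # minimizes next gradient norm: g'Ag / (Ag)'(Ag)
            alpha = (g @ Ag) / (Ag @ Ag)
        else:                              # BB1: s's / s'y with s = x_k - x_{k-1}, y = g_k - g_{k-1}
            if g_prev is None:
                alpha = (g @ g) / (g @ Ag) # fall back to steepest descent on the first step
            else:
                s, y = x - x_prev, g - g_prev
                alpha = (s @ s) / (s @ y)
        x_prev, g_prev = x.copy(), g.copy()
        x = x - alpha * g
        g = A @ x - b
    return x, history

# Small experiment on a random ill-conditioned quadratic (illustrative only).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = Q @ np.diag(np.logspace(0, 3, 50)) @ Q.T   # eigenvalues spread over [1, 1000]
b = rng.standard_normal(50)
for rule in ("sd", "mg", "bb1"):
    _, hist = quad_grad_descent(A, b, np.zeros(50), stepsize=rule)
    print(rule, "iterations:", len(hist), "final gradient norm:", hist[-1])
```

On ill-conditioned quadratics the Barzilai-Borwein rule typically needs far fewer iterations than exact steepest descent, whose iterates zigzag between two asymptotic directions; breaking that zigzagging pattern is the kind of acceleration the paper studies.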


Full work available at URL: https://arxiv.org/abs/1908.07111






Cited in: 14 documents





