On the acceleration of the Barzilai-Borwein method

From MaRDI portal
Publication:2114827

DOI: 10.1007/s10589-022-00349-z
zbMATH Open: 1487.90513
arXiv: 2001.02335
OpenAlex: W4206006973
MaRDI QID: Q2114827
FDO: Q2114827

Yuhong Dai, Hongchao Zhang, Yakui Huang, Xinwei Liu

Publication date: 15 March 2022

Published in: Computational Optimization and Applications

Abstract: The Barzilai-Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy and has the great advantage of being easily extended to solve a wide class of constrained optimization problems. In this paper, we propose a new stepsize that accelerates the BB method by requiring finite termination for minimizing two-dimensional strongly convex quadratic functions. Combining this new stepsize with the BB stepsizes, we develop gradient methods that adaptively take nonmonotone BB stepsizes and certain monotone stepsizes for minimizing general strongly convex quadratic functions. Furthermore, by incorporating nonmonotone line searches and gradient projection techniques, we extend these new gradient methods to solve general smooth unconstrained and bound-constrained optimization problems. Extensive numerical experiments show that our strategy of properly inserting monotone gradient steps into the nonmonotone BB method can significantly improve its performance, and that the resulting new methods can outperform the most successful gradient descent methods developed in the recent literature.
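For context, the baseline method that the paper accelerates can be sketched as follows. This is a minimal, hedged illustration of the classical BB1 stepsize on a strongly convex quadratic, not the paper's new stepsize or its adaptive monotone/nonmonotone scheme; all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Barzilai-Borwein (BB1) gradient method for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite. Illustrative sketch only."""
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple initial stepsize (a common choice)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 stepsize: alpha_k = s^T s / s^T y (nonmonotone in general)
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# Example: minimize a 2-D strongly convex quadratic; the minimizer solves A x = b.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 2.0])
x_star = bb_gradient(A, b, np.zeros(2))
```

The BB stepsize typically makes the objective values nonmonotone, which is precisely why the paper studies inserting monotone gradient steps into the iteration.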


Full work available at URL: https://arxiv.org/abs/2001.02335





Cites Work


Cited In (7)

Uses Software






This page was built for publication: On the acceleration of the Barzilai-Borwein method
