On the acceleration of the Barzilai-Borwein method
From MaRDI portal
Publication:2114827
DOI: 10.1007/S10589-022-00349-Z
zbMATH Open: 1487.90513
arXiv: 2001.02335
OpenAlex: W4206006973
MaRDI QID: Q2114827
FDO: Q2114827
Yuhong Dai, Hongchao Zhang, Yakui Huang, Xinwei Liu
Publication date: 15 March 2022
Published in: Computational Optimization and Applications
Abstract: The Barzilai-Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy and has the great advantage of being easily extended to a wide class of constrained optimization problems. In this paper, we propose a new stepsize that accelerates the BB method by requiring finite termination for minimizing a two-dimensional strongly convex quadratic function. Combining this new stepsize with the BB stepsizes, we develop gradient methods that adaptively take nonmonotone BB stepsizes and certain monotone stepsizes for minimizing a general strongly convex quadratic function. Furthermore, by incorporating nonmonotone line searches and gradient projection techniques, we extend these new gradient methods to general smooth unconstrained and bound-constrained optimization. Extensive numerical experiments show that our strategy of properly inserting monotone gradient steps into the nonmonotone BB method can significantly improve its performance, and that the resulting new methods can outperform the most successful gradient descent methods developed in the recent literature.
Full work available at URL: https://arxiv.org/abs/2001.02335
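To illustrate the baseline that the paper accelerates, here is a minimal sketch of the classical BB gradient method on a strongly convex quadratic, using the BB1 stepsize alpha_k = s^T s / s^T y with s = x_k - x_{k-1} and y = g_k - g_{k-1}. This is not the paper's accelerated variant (which adaptively mixes in monotone stepsizes); the function name and parameters are illustrative.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A using the nonmonotone Barzilai-Borwein (BB1) stepsize.
    Illustrative sketch of the classical method only."""
    x = x0.copy()
    g = A @ x - b                     # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)   # simple initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)     # BB1 stepsize; s^T y > 0 for SPD A
        x, g = x_new, g_new
    return x

# Usage: a small ill-conditioned strongly convex quadratic
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x = bb_gradient(A, b, np.zeros(3))   # converges to A^{-1} b
```

The BB1 stepsize solves min_alpha ||s/alpha - y||, giving the method a quasi-Newton flavor at the cost of nonmonotone objective values; the paper's contribution is choosing when to replace these stepsizes with monotone ones.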
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- The University of Florida sparse matrix collection
- Benchmarking optimization software with performance profiles
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Methods of conjugate gradients for solving linear systems
- Sparse Reconstruction by Separable Approximation
- New adaptive stepsize selections in gradient methods
- Analysis of monotone gradient methods
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- An efficient gradient method using the Yuan steplength
- A limited memory steepest descent method
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Feasible Barzilai–Borwein-like methods for extreme symmetric eigenvalue problems
- Gradient methods with adaptive step-sizes
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
- On the Barzilai and Borwein choice of steplength for the gradient method
- On the asymptotic directions of the s-dimensional optimum gradient method
- A Barzilai-Borwein conjugate gradient method
- Alternate step gradient method
- On the steepest descent algorithm for quadratic functions
- On the steplength selection in gradient methods for unconstrained optimization
- Projected Barzilai–Borwein method for large-scale nonnegative image restoration
- On the asymptotic convergence and acceleration of gradient methods
- A family of spectral gradient methods for optimization
- Quadratic regularization projected Barzilai-Borwein method for nonnegative matrix factorization
- Coordinated Beamforming for MISO Interference Channel: Complexity Analysis and Efficient Algorithms
- Gradient methods exploiting spectral properties
Cited In (7)
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- Delayed weighted gradient method with simultaneous step-sizes for strongly convex optimization
- A class of accelerators for BDF methods
- Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
- A gradient method exploiting the two dimensional quadratic termination property
- A harmonic framework for stepsize selection in gradient methods
- Barzilai–Borwein-like rules in proximal gradient schemes for ℓ1-regularized problems