On the acceleration of the Barzilai-Borwein method
From MaRDI portal
Publication:2114827
DOI: 10.1007/s10589-022-00349-z · zbMATH Open: 1487.90513 · OpenAlex: W4206006973 · MaRDI QID: Q2114827 · FDO: Q2114827
Authors: Yakui Huang, Yuhong Dai, Hongchao Zhang, Xinwei Liu
Publication date: 15 March 2022
Published in: Computational Optimization and Applications
Abstract: The Barzilai-Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy and has the great advantage of being easily extended to solve a wide class of constrained optimization problems. In this paper, we propose a new stepsize to accelerate the BB method by requiring finite termination for minimizing two-dimensional strongly convex quadratic functions. Combining this new stepsize with the BB method, we develop gradient methods that adaptively take nonmonotone BB stepsizes and certain monotone stepsizes for minimizing general strongly convex quadratic functions. Furthermore, by incorporating nonmonotone line searches and gradient projection techniques, we extend these new gradient methods to solve general smooth unconstrained and bound-constrained optimization problems. Extensive numerical experiments show that our strategy of properly inserting monotone gradient steps into the nonmonotone BB method can significantly improve its performance, and that the resulting new methods can outperform the most successful gradient descent methods developed in the recent literature.
Full work available at URL: https://arxiv.org/abs/2001.02335
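To illustrate the baseline method the paper accelerates, the following is a minimal sketch of the plain BB gradient method on a strongly convex quadratic \(f(x) = \tfrac12 x^T A x - b^T x\), using the standard BB1 stepsize \(\alpha_k = s_{k-1}^T s_{k-1} / s_{k-1}^T y_{k-1}\). This is not the accelerated method proposed in the paper; the function name and parameters are illustrative.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=500, tol=1e-8):
    """Plain Barzilai-Borwein gradient method for min 0.5 x^T A x - b^T x,
    with A symmetric positive definite. A minimal illustrative sketch of
    the baseline method, not the accelerated variant from the paper."""
    x = x0.astype(float)
    g = A @ x - b  # gradient of the quadratic at x
    # First iteration: exact (Cauchy) steplength g^T g / g^T A g.
    alpha = (g @ g) / (g @ (A @ g))
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 stepsize for the next iteration: s^T s / s^T y.
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x
```

Note that the iterates are typically nonmonotone in \(f\); the paper's contribution is to interleave such BB steps with certain monotone steps derived from a two-dimensional finite-termination property.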
Recommendations
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- Gradient methods for large scale convex quadratic functions
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
- On the asymptotic behaviour of some new gradient methods
- Gradient methods with adaptive step-sizes
Cites Work
- Title not available
- Title not available
- A Barzilai-Borwein conjugate gradient method
- A family of spectral gradient methods for optimization
- A limited memory steepest descent method
- A new stepsize for the steepest descent method
- Alternate minimization gradient method
- Alternate step gradient method
- An efficient gradient method using the Yuan steplength
- Analysis of monotone gradient methods
- Benchmarking optimization software with performance profiles
- Coordinated Beamforming for MISO Interference Channel: Complexity Analysis and Efficient Algorithms
- Feasible Barzilai-Borwein-like methods for extreme symmetric eigenvalue problems
- Gradient Method with Retards and Generalizations
- Gradient methods exploiting spectral properties
- Gradient methods with adaptive step-sizes
- Methods of conjugate gradients for solving linear systems
- New adaptive stepsize selections in gradient methods
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- On the Barzilai and Borwein choice of steplength for the gradient method
- On the asymptotic convergence and acceleration of gradient methods
- On the asymptotic directions of the s-dimensional optimum gradient method
- On the steepest descent algorithm for quadratic functions
- On the steplength selection in gradient methods for unconstrained optimization
- Projected Barzilai-Borwein method for large-scale nonnegative image restoration
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Quadratic regularization projected Barzilai-Borwein method for nonnegative matrix factorization
- Sparse Reconstruction by Separable Approximation
- Step-sizes for the gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The University of Florida sparse matrix collection
- Two-Point Step Size Gradient Methods
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
Cited In (31)
- Extended Barzilai-Borwein method for unconstrained minimization problems
- On the asymptotic behaviour of some new gradient methods
- Direct search conjugate gradient method based on adaptive Barzilai-Borwein step-size
- Two novel gradient methods with optimal step sizes
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- New adaptive stepsize selections in gradient methods
- Efficient methods for convex problems with Bregman Barzilai-Borwein step sizes
- New adaptive Barzilai-Borwein step size and its application in solving large-scale optimization problems
- Trust region BB methods for unconstrained minimization
- Stochastic gradient method with Barzilai-Borwein step for unconstrained nonlinear optimization
- Delayed weighted gradient method with simultaneous step-sizes for strongly convex optimization
- A class of accelerators for BDF methods
- Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
- A family of Barzilai-Borwein steplengths from the viewpoint of scaled total least squares
- Gradient methods with adaptive step-sizes
- Gradient methods for large scale convex quadratic functions
- Alternate step gradient method
- A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
- Delayed gradient methods for symmetric and positive definite linear systems
- Title not available
- Stabilized Barzilai-Borwein method
- On projected alternating BB methods for variational inequalities
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- A new gradient method via least change secant update
- A gradient method exploiting the two dimensional quadratic termination property
- Spectral properties of Barzilai-Borwein rules in solving singly linearly constrained optimization problems subject to lower and upper bounds
- A harmonic framework for stepsize selection in gradient methods
- Barzilai-Borwein-like rules in proximal gradient schemes for \(\ell_1\)-regularized problems
- Monotone projected gradient methods for large-scale box-constrained quadratic programming
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
- A new gradient method via quasi-Cauchy relation which guarantees descent
Uses Software
This page was built for publication: On the acceleration of the Barzilai-Borwein method