On the asymptotic convergence and acceleration of gradient methods
Abstract: We consider the asymptotic behavior of a family of gradient methods, which includes the steepest descent and minimal gradient methods as special instances. It is proved that each method in the family asymptotically zigzags between two directions. Asymptotic convergence results for the objective value, gradient norm, and stepsize are presented as well. To accelerate the family of gradient methods, we further exploit spectral properties of stepsizes to break the zigzagging pattern. In particular, a new stepsize is derived by imposing finite termination on minimizing a two-dimensional strictly convex quadratic function. It is shown that, for a general quadratic function, the proposed stepsize asymptotically converges to the reciprocal of the largest eigenvalue of the Hessian. Furthermore, based on this spectral property, we propose a periodic gradient method that incorporates the Barzilai-Borwein method. Numerical comparisons with some recent successful gradient methods show that our new method is very promising.
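To make the setting concrete, the sketch below contrasts the exact steepest descent (Cauchy) stepsize with the Barzilai-Borwein (BB1) stepsize on a strictly convex quadratic. This is a minimal illustration of the class of gradient methods the abstract refers to, not the paper's proposed periodic method; the function and parameter names are ours.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's algorithm): gradient methods
# for the strictly convex quadratic f(x) = 0.5*x'Ax - b'x with SPD Hessian A.
# The gradient is g(x) = Ax - b. Two classical stepsizes:
#   steepest descent (Cauchy):  alpha = g'g / g'Ag
#   Barzilai-Borwein (BB1):     alpha = s's / s'y,  s = x_k - x_{k-1},
#                                                   y = g_k - g_{k-1}

def gradient_method(A, b, x0, stepsize="sd", tol=1e-8, max_iter=5000):
    x = x0.copy()
    g = A @ x - b
    s_prev = y_prev = None
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if stepsize == "bb" and s_prev is not None:
            alpha = (s_prev @ s_prev) / (s_prev @ y_prev)  # BB1 step
        else:
            alpha = (g @ g) / (g @ (A @ g))  # exact (Cauchy) step
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s_prev, y_prev = x_new - x, g_new - g
        x, g = x_new, g_new
    return x, k

# Example on a random SPD quadratic.
rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q @ Q.T + 50 * np.eye(50)   # symmetric positive definite Hessian
b = rng.standard_normal(50)
x0 = np.zeros(50)
_, it_sd = gradient_method(A, b, x0, "sd")
_, it_bb = gradient_method(A, b, x0, "bb")
print(f"steepest descent: {it_sd} iterations, BB1: {it_bb} iterations")
```

On ill-conditioned problems, the exact stepsize exhibits the two-direction zigzag analyzed in the paper, which is why BB-type steps typically need far fewer iterations in practice.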
Cites work
- scientific article; zbMATH DE number 2221955 (no title available)
- A family of spectral gradient methods for optimization
- A new gradient method with an optimal stepsize property
- A new stepsize for the steepest descent method
- Alternate minimization gradient method
- Alternate step gradient method
- An efficient gradient method using the Yuan steplength
- Analysis of monotone gradient methods
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- Benchmarking optimization software with performance profiles
- Coordinated Beamforming for MISO Interference Channel: Complexity Analysis and Efficient Algorithms
- Feasible Barzilai-Borwein-like methods for extreme symmetric eigenvalue problems
- Gradient methods exploiting spectral properties
- Gradient methods with adaptive step-sizes
- Inexact and Preconditioned Uzawa Algorithms for Saddle Point Problems
- New adaptive stepsize selections in gradient methods
- New stepsizes for the gradient method
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- On spectral properties of steepest descent methods
- On the Barzilai and Borwein choice of steplength for the gradient method
- On the asymptotic behaviour of some new gradient methods
- On the asymptotic directions of the s-dimensional optimum gradient method
- On the behavior of the gradient norm in the steepest descent method
- On the steepest descent algorithm for quadratic functions
- On the steplength selection in gradient methods for unconstrained optimization
- Quadratic regularization projected Barzilai-Borwein method for nonnegative matrix factorization
- Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization
- Step-sizes for the gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
Cited in (15)
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Convex Synthesis of Accelerated Gradient Algorithms
- Alternate step gradient method
- On initial point selection of the steepest descent algorithm for general quadratic functions
- A family of spectral gradient methods for optimization
- Delayed gradient methods for symmetric and positive definite linear systems
- On the Rate of Convergence of a Partially Asynchronous Gradient Projection Algorithm
- On the acceleration of the Barzilai-Borwein method
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Accelerated gradient methods with absolute and relative noise in the gradient
- scientific article; zbMATH DE number 5018816 (no title available)
- Fast gradient method for low-rank matrix estimation
- A gradient method exploiting the two dimensional quadratic termination property
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- On the convergence analysis of the optimized gradient method