On the asymptotic convergence and acceleration of gradient methods
DOI: 10.1007/s10915-021-01685-8 · zbMATH Open: 1481.90309 · arXiv: 1908.07111 · OpenAlex: W3217755387 · MaRDI QID: Q2053340 · FDO: Q2053340
Authors: Yakui Huang, Yuhong Dai, Hongchao Zhang, Xinwei Liu
Publication date: 29 November 2021
Published in: Journal of Scientific Computing
Full work available at URL: https://arxiv.org/abs/1908.07111
Keywords: unconstrained optimization; quadratic optimization; Barzilai-Borwein method; gradient methods; asymptotic convergence; spectral property; acceleration of gradient methods
Cites Work
- Benchmarking optimization software with performance profiles
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Step-sizes for the gradient method
- New adaptive stepsize selections in gradient methods
- On the behavior of the gradient norm in the steepest descent method
- Analysis of monotone gradient methods
- On spectral properties of steepest descent methods
- Inexact and Preconditioned Uzawa Algorithms for Saddle Point Problems
- Alternate minimization gradient method
- An efficient gradient method using the Yuan steplength
- A new stepsize for the steepest descent method
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Feasible Barzilai-Borwein-like methods for extreme symmetric eigenvalue problems
- Gradient methods with adaptive step-sizes
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
- On the Barzilai and Borwein choice of steplength for the gradient method
- On the asymptotic behaviour of some new gradient methods
- A new gradient method with an optimal stepsize property
- On the asymptotic directions of the s-dimensional optimum gradient method
- Alternate step gradient method*
- On the steepest descent algorithm for quadratic functions
- Title not available
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- On the steplength selection in gradient methods for unconstrained optimization
- New stepsizes for the gradient method
- A family of spectral gradient methods for optimization
- Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization
- Quadratic regularization projected Barzilai-Borwein method for nonnegative matrix factorization
- Coordinated Beamforming for MISO Interference Channel: Complexity Analysis and Efficient Algorithms
- Gradient methods exploiting spectral properties
Cited In (14)
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- Title not available
- Fast gradient method for low-rank matrix estimation
- Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property
- On initial point selection of the steepest descent algorithm for general quadratic functions
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Alternate step gradient method*
- On the acceleration of the Barzilai-Borwein method
- Accelerated gradient methods with absolute and relative noise in the gradient
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- On the Rate of Convergence of a Partially Asynchronous Gradient Projection Algorithm
- A gradient method exploiting the two dimensional quadratic termination property
- On the convergence analysis of the optimized gradient method
- Convex Synthesis of Accelerated Gradient Algorithms