Global convergence of the Dai-Yuan conjugate gradient method with perturbations
From MaRDI portal
Publication: 263134
DOI: 10.1007/s11424-007-9037-y
zbMath: 1333.65061
OpenAlex: W2030413086
MaRDI QID: Q263134
Publication date: 4 April 2016
Published in: Journal of Systems Science and Complexity
Full work available at URL: https://doi.org/10.1007/s11424-007-9037-y
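The publication concerns the Dai-Yuan nonlinear conjugate gradient method, whose update parameter is $\beta_k = \|g_{k+1}\|^2 / (d_k^\top (g_{k+1} - g_k))$ (Dai and Yuan, 1999). As a minimal illustrative sketch, not the paper's perturbed variant, the following applies that formula to a convex quadratic with an exact line search (the function, variable names, and stopping rule are assumptions for the example):

```python
import numpy as np

def dai_yuan_cg(A, b, x0, tol=1e-8, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x with the Dai-Yuan CG method.

    The beta formula ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)) is the
    Dai-Yuan choice; the exact line search below is valid only for
    quadratics and stands in for the inexact searches used in practice.
    """
    x = x0.astype(float)
    g = A @ x - b          # gradient of the quadratic model
    d = -g                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact step for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                      # gradient difference y_k
        beta = (g_new @ g_new) / (d @ y)   # Dai-Yuan beta_k
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x
```

With an exact line search on a quadratic this reduces to linear conjugate gradients, so it recovers the solution of $Ax = b$ in at most $n$ steps.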
Cites Work
- Incremental gradient algorithms with stepsizes bounded away from zero
- Descent methods with linesearch in the presence of perturbations
- Convergence analysis of perturbed feasible descent methods
- Further insight into the convergence of the Fletcher-Reeves method
- Pseudogradient adaptation and training algorithms
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Restart procedures for the conjugate gradient method
- An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule
- Gradient Convergence in Gradient Methods with Errors
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property