Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
From MaRDI portal
Publication: 5931897
DOI: 10.1007/BF02669682 · zbMath: 0973.65048 · OpenAlex: W2320871099 · Wikidata: Q126226942 · Scholia: Q126226942 · MaRDI QID: Q5931897
Defeng Sun, Guang-Hui Liu, Ji-ye Han, Hong-Xia Yin
Publication date: 6 May 2001
Published in: Acta Mathematicae Applicatae Sinica. English Series
Full work available at URL: https://doi.org/10.1007/bf02669682
Keywords: unconstrained optimization; convergence; nonlinear conjugate gradient methods; Fletcher-Reeves algorithm; Hestenes-Stiefel algorithm; Polak-Ribière algorithm
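The three named algorithms differ only in the choice of the conjugate gradient parameter. For orientation, these are the standard textbook forms (not quoted from the paper itself; the variants analyzed there may differ in detail), with the assumed notation g_k = ∇f(x_k) and y_{k-1} = g_k - g_{k-1}:

\[
  d_k = -g_k + \beta_k d_{k-1}, \qquad
  \beta_k^{\mathrm{FR}} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
  \beta_k^{\mathrm{PR}} = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^2}, \qquad
  \beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}.
\]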
Related Items (5)
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- A class of one parameter conjugate gradient methods
- Two modified scaled nonlinear conjugate gradient methods
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A modified PRP conjugate gradient method
Cites Work
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Line search algorithms with guaranteed sufficient decrease
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems