Sufficient descent conjugate gradient methods for large-scale optimization problems
From MaRDI portal
Publication:2885559
DOI: 10.1080/00207160.2011.592938 · zbMath: 1242.90248 · MaRDI QID: Q2885559
Xiuyun Zheng, Aiguo Lu, Hong-Wei Liu
Publication date: 23 May 2012
Published in: International Journal of Computer Mathematics
Full work available at URL: https://doi.org/10.1080/00207160.2011.592938
Keywords: unconstrained optimization; global convergence; conjugate gradient method; Wolfe line search; sufficient descent
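For context on the keywords above, the sketch below illustrates the general idea of a sufficient-descent nonlinear conjugate gradient iteration; it is not the method proposed in the recorded paper. It uses a PRP+ update with a restart safeguard enforcing the sufficient descent condition g_k^T d_k <= -c||g_k||^2, and a backtracking Armijo line search as a simple stand-in for a Wolfe search. The quadratic test function and all parameter values are illustrative assumptions.

```python
def f(x):
    # assumed quadratic test function f(x, y) = x^2 + 10*y^2
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    # gradient of the test function above
    return [2.0 * x[0], 20.0 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cg_minimize(x0, tol=1e-8, max_iter=200, c=1e-4):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # initial steepest descent direction
    for _ in range(max_iter):
        if dot(g, g) < tol ** 2:   # stop when ||g|| < tol
            break
        # Sufficient descent safeguard: require g^T d <= -c * ||g||^2;
        # otherwise restart along the steepest descent direction.
        if dot(g, d) > -c * dot(g, g):
            d = [-gi for gi in g]
        # Backtracking Armijo line search (stand-in for a Wolfe search)
        alpha, gd = 1.0, dot(g, d)
        while f([x[i] + alpha * d[i] for i in range(2)]) > f(x) + c * alpha * gd:
            alpha *= 0.5
        x = [x[i] + alpha * d[i] for i in range(2)]
        g_new = grad(x)
        # PRP+ beta, truncated at zero (acts as an automatic restart)
        beta = max(0.0, (dot(g_new, g_new) - dot(g_new, g)) / dot(g, g))
        d = [-g_new[i] + beta * d[i] for i in range(2)]
        g = g_new
    return x

print(cg_minimize([3.0, 1.0]))
```

The PRP+ truncation (beta clipped at zero) and the explicit descent check are two common ways of guaranteeing that every search direction is a sufficient descent direction, which is the property the recorded paper's title refers to.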
Related Items
- Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
- A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.