Global convergence of a nonlinear conjugate gradient method
Publication: 410397
DOI: 10.1155/2011/463087
zbMath: 1235.90181
OpenAlex: W2047274706
Wikidata: Q58693086
Scholia: Q58693086
MaRDI QID: Q410397
Li-Min Zou, Xiao-Qian Song, Jin-Kui Liu
Publication date: 3 April 2012
Published in: Mathematical Problems in Engineering
Full work available at URL: https://doi.org/10.1155/2011/463087
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Numerical solutions to equations with nonlinear operators (65J15)
- Methods of reduced gradient type (90C52)
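For orientation, the sketch below shows a generic nonlinear conjugate gradient iteration with an Armijo-type backtracking line search, the kind of scheme treated in the works cited further down. The PRP+ update, the restart safeguard, and all parameter values are illustrative assumptions; they are not the specific formula or line search analyzed in this publication.

```python
# A minimal, self-contained sketch of a generic nonlinear conjugate gradient
# method with an Armijo-type backtracking line search. The PRP+ beta formula,
# the restart rule, and all parameter values are illustrative assumptions;
# they are NOT the specific method proposed in this publication.
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=5000, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, rho, c = 1.0, 0.5, 1e-4
        fx, slope = f(x), c * g.dot(d)       # slope < 0 since d is a descent direction
        while f(x + alpha * d) > fx + alpha * slope and alpha > 1e-16:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ conjugacy parameter (assumed here for illustration)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Usage example on the Rosenbrock function
    f = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])
    print(nonlinear_cg(f, grad, [-1.2, 1.0]))  # iterates approach (1, 1)
```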
Cites Work
- New spectral PRP conjugate gradient method for unconstrained optimization
- A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Convergence Properties of Algorithms for Nonlinear Optimization
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.