The global convergence of a new mixed conjugate gradient method for unconstrained optimization
From MaRDI portal
Publication: 1952993
DOI: 10.1155/2012/932980
zbMath: 1268.65086
OpenAlex: W1999180899
Wikidata: Q58906958 (Scholia: Q58906958)
MaRDI QID: Q1952993
Publication date: 3 June 2013
Published in: Journal of Applied Mathematics
Full work available at URL: https://doi.org/10.1155/2012/932980
Keywords: unconstrained optimization; global convergence; numerical experiments; Wolfe line search; nonlinear conjugate gradient method
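The keywords describe a nonlinear conjugate gradient method with a Wolfe line search. The paper's own *mixed* beta formula is not given on this page, so the sketch below is illustrative only: it uses the classical Fletcher-Reeves update (cited below) as a stand-in, with a backtracking Armijo line search as a simplification of the full Wolfe conditions.

```python
# Illustrative sketch, NOT the paper's mixed method: classical
# Fletcher-Reeves nonlinear CG with Armijo backtracking.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def fr_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f starting from x0 via Fletcher-Reeves CG.

    Restarts along the steepest-descent direction every len(x0)
    iterations, a standard safeguard for nonlinear CG.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    n = len(x0)
    for k in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        slope = dot(g, d)
        if slope >= 0:                      # safeguard: not a descent direction
            d = [-gi for gi in g]
            slope = dot(g, d)
        # Backtracking line search: halve alpha until the Armijo
        # sufficient-decrease condition holds.
        alpha, c = 1.0, 1e-4
        fx = f(x)
        while (f([xi + alpha * di for xi, di in zip(x, d)])
               > fx + c * alpha * slope and alpha > 1e-12):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if (k + 1) % n == 0:
            d = [-gi for gi in g_new]              # periodic restart
        else:
            beta = dot(g_new, g_new) / dot(g, g)   # Fletcher-Reeves beta
            d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# A simple convex quadratic with minimizer (1, -2), used as a smoke test.
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: [2 * (x[0] - 1), 20 * (x[1] + 2)]
x_min = fr_cg(f, grad, [0.0, 0.0])
```

The mixed methods surveyed in the cited works replace the Fletcher-Reeves `beta` with hybrid formulas (e.g. combinations of Fletcher-Reeves and Polak-Ribière-Polyak terms) chosen to guarantee descent and global convergence under Wolfe line searches.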
Related Items
Uses Software
Cites Work
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Efficient hybrid conjugate gradient techniques
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems