A conjugate gradient method for unconstrained optimization problems
From MaRDI portal
Publication: 963493
DOI: 10.1155/2009/329623
zbMath: 1190.90223
OpenAlex: W2140096898
Wikidata: Q58648434 (Scholia: Q58648434)
MaRDI QID: Q963493
Publication date: 20 April 2010
Published in: International Journal of Mathematics and Mathematical Sciences
Full work available at URL: https://eudml.org/doc/227568
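The record's title refers to a nonlinear conjugate gradient method for unconstrained optimization. The portal page does not reproduce the paper's algorithm, but the general iteration shared by the methods cited below can be sketched with the classical Fletcher-Reeves conjugacy parameter and a simple Armijo backtracking line search; the specific update formula of this paper may differ, and the function name here is illustrative only.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f from x0 by nonlinear conjugate gradients
    (Fletcher-Reeves beta, Armijo backtracking line search)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves parameter: ||g_{k+1}||^2 / ||g_k||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the quadratic f(x) = x^T x (minimizer at the origin)
x_min = fletcher_reeves_cg(lambda x: x.dot(x), lambda x: 2 * x, [3.0, -4.0])
```

Other choices of the conjugacy parameter (Polak-Ribière-Polyak, Hestenes-Stiefel, hybrid variants) slot into the same loop by replacing the `beta` line.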
Related Items
- A class of one parameter conjugate gradient methods
- An active set limited memory BFGS algorithm for bound constrained optimization
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A modification of classical conjugate gradient method using strong Wolfe line search
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A new hybrid conjugate gradient method of unconstrained optimization methods
Cites Work
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified PRP conjugate gradient method
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- Optimization. Algorithms and consistent approximations
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- New line search methods for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles