A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
Publication: 415335
DOI: 10.1007/s11075-011-9515-0 · zbMath: 1245.65069 · OpenAlex: W1979559025 · MaRDI QID: Q415335
Publication date: 8 May 2012
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-011-9515-0
Keywords: global convergence; numerical results; conjugate gradient method; line search; sufficient descent condition; conjugate-descent-type; Dai-Yuan-type; Hestenes-Stiefel; Liu-Storey; Polak-Ribière-Polyak
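For orientation, the method names in the keywords refer to the classical choices of the parameter \(\beta_k\) in the nonlinear conjugate gradient iteration \(x_{k+1} = x_k + \alpha_k d_k\), \(d_{k+1} = -g_{k+1} + \beta_k d_k\), \(d_0 = -g_0\). The standard textbook definitions (not the paper's new general form) are sketched below, together with the sufficient descent condition mentioned in the keywords.

\[
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
\beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}, \qquad
\beta_k^{LS} = \frac{g_{k+1}^{\top} y_k}{-d_k^{\top} g_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \qquad
\beta_k^{CD} = \frac{\|g_{k+1}\|^2}{-d_k^{\top} g_k},
\]

where \(y_k = g_{k+1} - g_k\). The sufficient descent condition requires \(g_k^{\top} d_k \le -c\,\|g_k\|^2\) for some constant \(c > 0\) and all \(k\).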
Related Items (1)
Uses Software
Cites Work
- A class of globally convergent conjugate gradient methods
- On the convergence property of the DFP algorithm
- On the limited memory BFGS method for large scale optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- Algorithm 851
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Methods of conjugate gradients for solving linear systems
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles