A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
DOI: 10.1007/s11075-011-9515-0
zbMATH Open: 1245.65069
OpenAlex: W1979559025
MaRDI QID: Q415335
Authors: Kairong Wang, Yang Zhang
Publication date: 8 May 2012
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-011-9515-0
Recommendations
- Global convergence of a class of new conjugate gradient methods
- Global convergence of a class of sufficient descent conjugate gradient methods
- Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- scientific article; zbMATH DE number 1159286
- scientific article; zbMATH DE number 1549856
Keywords: numerical results; global convergence; conjugate gradient method; line search; sufficient descent condition; conjugate-descent-type; Dai-Yuan-type; Hestenes-Stiefel; Liu-Storey; Polak-Ribière-Polyak
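For context, the keywords refer to the classical nonlinear conjugate gradient framework. The sketch below is a standard textbook summary of that framework and of the named β-formulas, not the specific general form proposed in the paper.

```latex
% Generic nonlinear conjugate gradient iteration (background sketch only;
% not the general form introduced in this publication).
\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad
  d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]
% where $g_k = \nabla f(x_k)$, $\alpha_k$ is chosen by a line search,
% and $y_k = g_{k+1} - g_k$. Classical choices of $\beta_k$ named in the
% keywords (Hestenes-Stiefel, Polak-Ribi\`ere-Polyak, Liu-Storey,
% Dai-Yuan, conjugate descent):
\[
  \beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \quad
  \beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}, \quad
  \beta_k^{LS} = -\frac{g_{k+1}^{\top} y_k}{d_k^{\top} g_k}, \quad
  \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \quad
  \beta_k^{CD} = -\frac{\|g_{k+1}\|^2}{d_k^{\top} g_k}.
\]
% The sufficient descent condition ("guaranteed descent" in the title)
% requires a constant $c > 0$ such that, for all $k$,
\[
  g_k^{\top} d_k \le -c\,\|g_k\|^2 .
\]
```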
Cites Work
- Algorithm 851
- CUTE
- Benchmarking optimization software with performance profiles
- On the limited memory BFGS method for large scale optimization
- Title not available
- Title not available
- Title not available
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Methods of conjugate gradients for solving linear systems
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Convergence Properties of Algorithms for Nonlinear Optimization
- A class of globally convergent conjugate gradient methods
- On the convergence property of the DFP algorithm
- New properties of a nonlinear conjugate gradient method
Cited In (5)
- Algorithm 851
- New conjugate gradient-like methods for unconstrained optimization
- A new nonlinear conjugate gradient method with guaranteed global convergence
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Several guaranteed descent conjugate gradient methods for unconstrained optimization