Two modified HS type conjugate gradient methods for unconstrained optimization problems
DOI: 10.1016/j.na.2010.09.046 · zbMath: 1203.49049 · OpenAlex: W1992047442 · MaRDI QID: Q611200
Publication date: 14 December 2010
Published in: Nonlinear Analysis. Theory, Methods & Applications. Series A: Theory and Methods
Full work available at URL: https://doi.org/10.1016/j.na.2010.09.046
Keywords: unconstrained optimization; global convergence; conjugate gradient method; line search; sufficient descent property
Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Methods of reduced gradient type (90C52)
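For context on the title and keywords: the classical Hestenes-Stiefel (HS) conjugate gradient update referenced here computes the direction d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T y_k / (d_k^T y_k), where y_k = g_{k+1} - g_k. The sketch below is a generic textbook HS iteration with an Armijo backtracking line search, not the two modified HS methods proposed in the paper; the function names, step-size constants, restart safeguard, and the Rosenbrock usage example are illustrative assumptions.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, max_iter=1000, tol=1e-6):
    """Classical Hestenes-Stiefel nonlinear CG with a simple Armijo
    backtracking line search. A generic sketch for context only; it is
    NOT the modified HS methods of the cited paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:           # safeguard: restart if not a descent direction
            d = -g
        # Armijo backtracking line search (constants chosen for illustration)
        alpha, c1 = 1.0, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Hestenes-Stiefel coefficient: beta = g_{k+1}^T y_k / (d_k^T y_k)
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage example: minimize the Rosenbrock function from a standard start point
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(hs_conjugate_gradient(f, grad, np.array([-1.2, 1.0])))
```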
Cites Work
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- Efficient generalized conjugate gradient algorithms. I: Theory
- Global convergence result for conjugate gradient methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A three-parameter family of nonlinear conjugate gradient methods
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles