On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
From MaRDI portal
Publication: Q483732
DOI: 10.1007/s10288-014-0255-6
zbMath: 1307.65084
OpenAlex: W2015288657
MaRDI QID: Q483732
Publication date: 17 December 2014
Published in: 4OR
Full work available at URL: https://doi.org/10.1007/s10288-014-0255-6
Keywords: unconstrained optimization; global convergence; eigenvalue; conjugate gradient algorithm; large-scale problem; sufficient descent condition; Hager-Zhang nonlinear conjugate gradient method
MSC classifications: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
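The publication concerns the sufficient descent condition of the Hager-Zhang conjugate gradient method. As a minimal illustrative sketch (not the paper's own analysis), the snippet below builds the Hager-Zhang direction \(d_{k+1} = -g_{k+1} + \beta_k^{HZ} d_k\) on a strictly convex quadratic and checks numerically that the well-known sufficient descent bound \(g_{k+1}^T d_{k+1} \le -\tfrac{7}{8}\|g_{k+1}\|^2\) holds; the quadratic test problem and iteration count are arbitrary choices for the demo.

```python
import numpy as np

def hz_beta(g_new, d, y):
    """Hager-Zhang CG parameter beta_k^HZ (Hager & Zhang, 2005)."""
    dy = d @ y  # positive here because the quadratic line search gives alpha > 0
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

# Arbitrary SPD test problem: f(x) = 0.5 x^T A x - b^T x (an assumption for the demo).
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite Hessian
b = rng.standard_normal(n)

x = np.zeros(n)
g = A @ x - b                      # gradient of the quadratic at x
d = -g                             # initial direction: steepest descent

for k in range(10):
    alpha = -(g @ d) / (d @ (A @ d))   # exact line search along d
    x_new = x + alpha * d
    g_new = A @ x_new - b
    y = g_new - g
    d_new = -g_new + hz_beta(g_new, d, y) * d
    # Sufficient descent condition studied in the paper's setting:
    assert g_new @ d_new <= -0.875 * (g_new @ g_new) + 1e-10
    x, g, d = x_new, g_new, d_new

print("sufficient descent held on all iterations")
```

With an exact line search the bound is satisfied easily (the term \(\beta_k^{HZ} g_{k+1}^T d_k\) vanishes); the interest of the condition, and of this publication, lies in the fact that Hager and Zhang proved it independently of the line search.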
Related Items (10)
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Improving the Dai-Liao parameter choices using a fixed point equation
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
Uses Software
Cites Work
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- On the limited memory BFGS method for large scale optimization
- Optimization theory and methods. Nonlinear programming
- Algorithm 851
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New properties of a nonlinear conjugate gradient method