On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
DOI: 10.1007/s10288-014-0255-6 · zbMATH Open: 1307.65084 · OpenAlex: W2015288657 · MaRDI QID: Q483732
Authors: Saman Babaie-Kafaki
Publication date: 17 December 2014
Published in: 4OR
Full work available at URL: https://doi.org/10.1007/s10288-014-0255-6
Recommendations
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
Keywords: global convergence; unconstrained optimization; eigenvalue; conjugate gradient algorithm; sufficient descent condition; large-scale problem; Hager-Zhang nonlinear conjugate gradient method
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
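Since the record's subject is the sufficient descent condition of the Hager-Zhang (HZ) conjugate gradient method, a minimal numerical sketch may be useful for orientation. It applies the HZ direction update on a small convex quadratic and checks the sufficient descent property g_k^T d_k <= -(7/8)||g_k||^2 that HZ directions are known to satisfy (Hager and Zhang, 2005). The quadratic test problem, tolerances, and iteration cap are illustrative assumptions, not taken from the publication itself:

```python
import numpy as np

# Illustrative sketch (not the authors' code): Hager-Zhang (HZ)
# conjugate gradient directions on a convex quadratic
#   f(x) = 0.5 x^T A x - b^T x,  grad f(x) = A x - b,
# checking the sufficient descent property
#   g_k^T d_k <= -(7/8) ||g_k||^2.
A = np.diag([1.0, 10.0, 100.0])   # SPD Hessian (assumed test problem)
b = np.ones(3)
grad = lambda x: A @ x - b

x = np.zeros(3)
g = grad(x)
d = -g                            # initial steepest descent direction
for k in range(10):
    # exact line search on a quadratic: alpha = -g^T d / (d^T A d)
    alpha = -(g @ d) / (d @ A @ d)
    x_new = x + alpha * d
    g_new = grad(x_new)
    if np.linalg.norm(g_new) < 1e-12:   # converged (finite termination)
        x, g = x_new, g_new
        break
    y = g_new - g
    dy = d @ y
    # HZ parameter: beta = (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y)
    beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
    d = -g_new + beta * d
    x, g = x_new, g_new
    # sufficient descent condition (small slack for rounding)
    assert g @ d <= -(7.0 / 8.0) * (g @ g) + 1e-12
print("final gradient norm:", np.linalg.norm(grad(x)))
```

With exact line searches on an n-dimensional SPD quadratic, the iteration terminates in at most n steps; the descent check above is the property the cited publication studies under general line searches.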
Cites Work
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- On the limited memory BFGS method for large scale optimization
- Updating Quasi-Newton Matrices with Limited Storage
- Optimization theory and methods. Nonlinear programming
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A survey of nonlinear conjugate gradient methods
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- New properties of a nonlinear conjugate gradient method
Cited In (15)
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- Another Hager-Zhang-type method via singular-value study for constrained monotone equations with application
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- The Hager-Zhang conjugate gradient algorithm for large-scale nonlinear equations
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- Sparse signal reconstruction via Hager–Zhang-type schemes for constrained system of nonlinear equations
- Improving the Dai-Liao parameter choices using a fixed point equation