On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 7109378
- scientific article; zbMATH DE number 5845974
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
Cites work
- scientific article; zbMATH DE number 3843083
- scientific article; zbMATH DE number 3278849
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- A survey of nonlinear conjugate gradient methods
- Algorithm 851
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New properties of a nonlinear conjugate gradient method
- On the limited memory BFGS method for large scale optimization
- Optimization theory and methods. Nonlinear programming
- The conjugate gradient method in extremal problems
- Updating Quasi-Newton Matrices with Limited Storage
Cited in (15)
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- The Hager-Zhang conjugate gradient algorithm for large-scale nonlinear equations
- Sparse signal reconstruction via Hager-Zhang-type schemes for constrained system of nonlinear equations
- Another Hager-Zhang-type method via singular-value study for constrained monotone equations with application
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Improving the Dai-Liao parameter choices using a fixed point equation
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
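For context, the sufficient descent condition discussed by the publication this page indexes can be illustrated with a short sketch of the Hager-Zhang search direction. This is a minimal illustration, not the authors' implementation: the quadratic test problem, the exact line search, and all variable names are ours, and the update uses the standard form β_k^HZ = (y_k − 2 d_k ‖y_k‖²/(d_kᵀy_k))ᵀ g_{k+1} / (d_kᵀy_k), under which d_kᵀg_k ≤ −(7/8)‖g_k‖² holds:

```python
import numpy as np

def beta_hz(g_new, g_old, d):
    """Hager-Zhang CG parameter for the direction update d_new = -g_new + beta*d."""
    y = g_new - g_old
    dy = d @ y  # d^T y; positive under Wolfe-type line searches
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

# Demo problem (ours): convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
g = grad(x)
d = -g  # steepest descent start
for _ in range(20):
    # Exact line search for a quadratic: alpha = -g^T d / (d^T A d)
    alpha = -(g @ d) / (d @ A @ d)
    x = x + alpha * d
    g_new = grad(x)
    if np.linalg.norm(g_new) < 1e-10:
        break
    d = -g_new + beta_hz(g_new, g, d) * d
    # Sufficient descent: d^T g <= -(7/8) ||g||^2 (small tolerance for roundoff)
    assert d @ g_new <= -0.875 * (g_new @ g_new) + 1e-12
    g = g_new
```

With an exact line search on a quadratic the sufficient descent bound holds with room to spare, since d_kᵀg_{k+1} = 0 makes the new direction satisfy d_{k+1}ᵀg_{k+1} = −‖g_{k+1}‖²; the cited works study the harder case of inexact (Wolfe-type) line searches.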