Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization
From MaRDI portal
Publication:5199047
Cites work
- scientific article; zbMATH DE number 1243473 (no title available)
- scientific article; zbMATH DE number 3363347 (no title available)
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new family of conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- Convergence Conditions for Ascent Methods
- Convergence Properties of Algorithms for Nonlinear Optimization
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Efficient hybrid conjugate gradient techniques
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence of conjugate gradient methods without line search
- Methods of conjugate gradients for solving linear systems
- New line search methods for unconstrained optimization
- Restart procedures for the conjugate gradient method
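For context, the quantity the publication modifies is the classical Hestenes-Stiefel coefficient β<sub>k</sub><sup>HS</sup> = g<sub>k+1</sub><sup>T</sup>y<sub>k</sub> / (d<sub>k</sub><sup>T</sup>y<sub>k</sub>), with y<sub>k</sub> = g<sub>k+1</sub> − g<sub>k</sub>. The sketch below shows the classical coefficient inside a conjugate gradient loop on a quadratic with exact line search; it illustrates the standard formula only, not the authors' modified coefficient, and all function names are illustrative.

```python
import numpy as np

def hs_coefficient(g_new, g_old, d):
    """Classical Hestenes-Stiefel coefficient: g_{k+1}^T y_k / (d_k^T y_k)."""
    y = g_new - g_old
    return float(g_new @ y) / float(d @ y)

def cg_quadratic_hs(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by conjugate gradient with the HS coefficient and exact line search."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b          # gradient of the quadratic
    d = -g                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact step length for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        beta = hs_coefficient(g_new, g, d)
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x

# Usage: on an SPD quadratic the minimizer satisfies A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_quadratic_hs(A, b, np.zeros(2))
```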
Cited in (3)