On Hager and Zhang's conjugate gradient method with guaranteed descent
From MaRDI portal
Publication: 273329
DOI: 10.1016/j.amc.2014.02.080 · zbMath: 1334.65111 · OpenAlex: W2067422909 · MaRDI QID: Q273329
Publication date: 21 April 2016
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2014.02.080
Keywords: global convergence; conjugate gradient method; conjugacy condition; descent properties; secant equation; spectral scaling
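For context, the publication studies the Hager–Zhang conjugate gradient method (CG_DESCENT, cited below as "A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search"). A minimal sketch of its direction update follows; the vector representation and function names are illustrative assumptions, not the authors' implementation:

```python
# Sketch of the Hager-Zhang (CG_DESCENT) direction update on plain-Python
# vectors. Illustrative only; not the reference implementation.

def dot(a, b):
    """Inner product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def hz_direction(g_new, g_old, d_old):
    """One Hager-Zhang update: d_new = -g_new + beta_HZ * d_old.

    With y = g_new - g_old, the Hager-Zhang parameter is
        beta_HZ = (y - 2*d_old*(||y||^2 / d_old.y)) . g_new / (d_old . y),
    which yields the guaranteed-descent property
        d_new . g_new <= -(7/8) * ||g_new||^2.
    """
    y = [gn - go for gn, go in zip(g_new, g_old)]
    dy = dot(d_old, y)  # assumed nonzero (ensured by the Wolfe line search)
    beta = (dot(y, g_new) - 2.0 * dot(d_old, g_new) * dot(y, y) / dy) / dy
    return [-gn + beta * dd for gn, dd in zip(g_new, d_old)]
```

With gradients `g_old = [2, 1]`, `g_new = [1, -0.5]` and previous direction `d_old = [-2, -1]`, the returned direction satisfies the sufficient-descent inequality above.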
Related Items (1)
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Spectral scaling BFGS method
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- New quasi-Newton methods for unconstrained optimization problems
- A three-parameter family of nonlinear conjugate gradient methods
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Technical Note—A Modified Conjugate Gradient Algorithm
- Updating Quasi-Newton Matrices with Limited Storage
- On the Convergence of a New Conjugate Gradient Algorithm
- CUTE
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations