Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Publication: 442712
DOI: 10.1016/j.cam.2012.01.036 | zbMath: 1258.65059 | OpenAlex: W2170270075 | MaRDI QID: Q442712
Hiroshi Yabe, Yasushi Narushima
Publication date: 3 August 2012
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2012.01.036
Keywords: unconstrained optimization; global convergence; conjugate gradient method; large-scale; descent search direction; secant condition
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
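For orientation, the record's topic is conjugate gradient methods whose search directions are guaranteed descent directions. A minimal sketch of one well-known scheme in this family is the three-term PRP direction of Zhang, Zhou, and Li (related to the cited three-term sufficient-descent CG work), which satisfies g_k^T d_k = -||g_k||^2 by construction. This is an illustrative example only, not the method of the paper itself; the test objective (a small quadratic) and all names are assumptions for the sketch.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Three-term PRP conjugate gradient method (Zhang-Zhou-Li style).

    The direction d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1} satisfies
    g_k^T d_k = -||g_k||^2, so every direction is a descent direction.
    Illustrative sketch, not the method of the paper this record describes.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (g @ d < 0 is guaranteed)
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg           # PRP parameter
        theta = (g_new @ d) / gg          # correction weight
        # The beta- and theta-terms cancel in g_new^T d_new,
        # leaving g_new^T d_new = -||g_new||^2 exactly.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x, g

# Usage on a small strictly convex quadratic (assumed test problem):
# f(x) = 0.5 x^T A x - b^T x, minimizer x* = A^{-1} b = [0.2, 0.4]
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_opt, g_opt = three_term_cg(f, grad, [0.0, 0.0])
```

The cancellation of the second and third terms in g_k^T d_k is what removes the usual line-search restrictions needed to keep plain PRP directions downhill.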
Related Items (18)
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.