A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
From MaRDI portal
Publication: 2515111
DOI: 10.1016/j.cam.2014.11.058
zbMath: 1309.65074
OpenAlex: W1970620591
MaRDI QID: Q2515111
Ximei Yang, Xiao Liang Dong, Hong-Wei Liu, Yu Bo He
Publication date: 11 February 2015
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2014.11.058
Keywords: global convergence; sufficient descent condition; numerical comparison; Hestenes-Stiefel conjugate gradient method; adaptive conjugacy condition
MSC classification: Numerical optimization and variational techniques (65K10); Methods of quasi-Newton type (90C53); Iterative numerical methods for linear systems (65F10)
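For context on the method this record concerns: the classical (unmodified) Hestenes-Stiefel conjugate gradient iteration computes the search direction as d_{k+1} = -g_{k+1} + β_k d_k with β_k^{HS} = g_{k+1}^T(g_{k+1} - g_k) / (d_k^T(g_{k+1} - g_k)). The paper's contribution modifies this scheme to enforce a sufficient descent condition and a conjugacy condition; the sketch below shows only the standard HS method with a simple Armijo backtracking line search (all function and variable names are illustrative, not from the paper):

```python
import numpy as np

def hestenes_stiefel_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Classical Hestenes-Stiefel nonlinear CG with backtracking
    Armijo line search (illustrative sketch, not the paper's
    modified method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference y_k
        denom = d.dot(y)
        # HS formula: beta = g_{k+1}^T y_k / (d_k^T y_k)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d         # new search direction
        x, g = x_new, g_new
    return x

# Minimize a convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hestenes_stiefel_cg(f, grad, np.zeros(2))
```

Note that with an inexact (Armijo) line search the plain HS direction is not guaranteed to be a descent direction; ensuring sufficient descent without such safeguards is exactly the kind of issue the modified methods in this literature address.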
Related Items
Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination ⋮ A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations ⋮ Comment on ``A new three-term conjugate gradient method for unconstrained problem ⋮ An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ A modified spectral gradient projection-based algorithm for large-scale constrained nonlinear equations with applications in compressive sensing ⋮ A modified descent Polak-Ribiére-Polyak conjugate gradient method with global convergence property for nonconvex functions ⋮ A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions ⋮ A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property ⋮ Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition ⋮ A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems ⋮ A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems ⋮ An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization ⋮ An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method ⋮ New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction ⋮ Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence ⋮ An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method ⋮ A NEW THREE–TERM CONJUGATE GRADIENT METHOD WITH DESCENT DIRECTION FOR UNCONSTRAINED OPTIMIZATION ⋮ Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Efficient generalized conjugate gradient algorithms. I: Theory
- Conjugate gradient algorithms in nonconvex optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles