A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
DOI: 10.1155/2018/5057096 · zbMath: 1460.90205 · OpenAlex: W2893252565 · MaRDI QID: Q1722876
Ahmad Alhawarat, Bakhtawar Baluch, Zabidin Salleh
Publication date: 18 February 2019
Published in: Journal of Optimization
Full work available at URL: https://doi.org/10.1155/2018/5057096
Numerical optimization and variational techniques (65K10) · Methods of quasi-Newton type (90C53) · Methods of reduced gradient type (90C52)
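The paper's exact modified formula is not reproduced in this record, but a generic three-term Hestenes-Stiefel direction of the kind the title describes can be sketched as follows. This is an illustrative variant (the classical HS parameter plus a third correction term chosen so that g_k^T d_k = -||g_k||^2 holds exactly, i.e. sufficient descent), not the authors' specific method; the function names and the Armijo backtracking line search are assumptions for the sketch.

```python
import numpy as np

def three_term_hs(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of a generic three-term Hestenes-Stiefel CG iteration
    (not the paper's exact formula):
        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},
    with beta_k = g_k^T y_{k-1} / d_{k-1}^T y_{k-1}  (HS parameter)
    and  theta_k = g_k^T d_{k-1} / d_{k-1}^T y_{k-1},
    which gives g_k^T d_k = -||g_k||^2 (sufficient descent) whenever
    d_{k-1}^T y_{k-1} != 0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (assumed; the paper uses
        # Wolfe-type conditions).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, gtd = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gtd and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference y_{k-1}
        dty = d @ y
        if abs(dty) > 1e-12:
            beta = (g_new @ y) / dty   # Hestenes-Stiefel parameter
            theta = (g_new @ d) / dty  # third-term weight
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                 # restart with steepest descent
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic f(x) = 1/2 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_hs(f, grad, np.zeros(2))
```

The third term is what distinguishes three-term schemes from the classical two-term HS update: it cancels the component of the direction that would otherwise spoil the descent identity, so sufficient descent holds independently of the line search.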
Related Items
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- Optimal homotopy asymptotic method-least square for solving nonlinear fractional-order gradient-based dynamic system from an optimization problem
- Some three-term conjugate gradient methods with the new direction structure
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Two modified three-term conjugate gradient methods with sufficient descent property
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- The convergence of conjugate gradient method with nonmonotone line search
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- A conjugate gradient method with descent direction for unconstrained optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A conjugate direction algorithm without line searches
- Optimization. Algorithms and consistent approximations
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A new family of conjugate gradient methods
- A three-parameter family of nonlinear conjugate gradient methods
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- A Two-Term PRP-Based Descent Method
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.