A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
MaRDI QID: Q2854330
DOI: 10.1080/01630563.2013.777350
zbMath: 1274.90287
OpenAlex: W2019460058
Publication date: 18 October 2013
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630563.2013.777350
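For background on the method family named in the title: a nonlinear conjugate gradient method generates search directions d_{k+1} = -g_{k+1} + beta_k d_k, and the classical Hestenes-Stiefel choice is beta_k = g_{k+1}^T y_k / (d_k^T y_k) with y_k = g_{k+1} - g_k. The sketch below shows this classical HS update, not the paper's modified variant; the solver name, the Armijo backtracking line search, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of nonlinear CG with the classical Hestenes-Stiefel beta:
#   beta_HS = g_{k+1}^T y_k / (d_k^T y_k),  y_k = g_{k+1} - g_k.
# This is generic background, not the paper's modified algorithm.
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                            # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:             # safeguard: restart if not a descent direction
            d = -g
        # Backtracking Armijo line search (illustrative parameter choices).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        # Hestenes-Stiefel coefficient; fall back to steepest descent
        # when the denominator is numerically negligible.
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage example: minimize the 2-D Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(hs_conjugate_gradient(f, grad, [-1.2, 1.0]))
```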
Related Items
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- A new adaptive trust region algorithm for optimization problems
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- A quasi-Newton algorithm for large-scale nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified three-term conjugate gradient method with sufficient descent property
- Wei–Yao–Liu conjugate gradient projection algorithm for nonlinear monotone equations with convex constraints
- Behavior of the combination of PRP and HZ methods for unconstrained optimization
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
Cites Work
- Convergence analysis of a modified BFGS method on convex minimizations
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- A conjugate gradient method for unconstrained optimization problems
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified PRP conjugate gradient method
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE: Constrained and Unconstrained Testing Environment
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Global convergence of the BFGS algorithm with nonmonotone line search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles