A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
Publication: 4638925
DOI: 10.1080/10556788.2017.1325885 · zbMath: 1397.90361 · OpenAlex: W2615612939 · MaRDI QID: Q4638925
Publication date: 2 May 2018
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2017.1325885
Keywords: global convergence; nonlinear conjugate gradient method; sufficient descent; memoryless BFGS method; Hestenes–Stiefel method
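For orientation only, the sketch below contrasts the two objects the title relates: the classical Hestenes–Stiefel conjugate gradient direction and the memoryless BFGS quasi-Newton direction. These are the standard textbook formulas, not the paper's modified update; the function names and the toy quadratic are illustrative assumptions.

```python
import numpy as np

def hs_direction(g_new, g_old, d_old):
    # Classical Hestenes-Stiefel CG direction (standard formula, not
    # the paper's modification):
    #   d+ = -g+ + beta_HS * d,  beta_HS = g+^T y / (d^T y),  y = g+ - g_old
    y = g_new - g_old
    beta = g_new.dot(y) / d_old.dot(y)
    return -g_new + beta * d_old

def memoryless_bfgs_direction(g_new, s, y):
    # Memoryless BFGS direction d+ = -H+ g+, where H+ is the BFGS
    # update of the identity with the most recent pair
    # s = x+ - x, y = g+ - g; the matrix H+ is never formed.
    sy = s.dot(y)
    sg = s.dot(g_new)
    yg = y.dot(g_new)
    Hg = g_new - (s * yg + y * sg) / sy + (1.0 + y.dot(y) / sy) * (sg / sy) * s
    return -Hg

# Toy check on f(x) = 0.5 * x^T A x (hypothetical data):
A = np.diag([1.0, 10.0])
x_old, x_new = np.array([1.0, 1.0]), np.array([0.9, 0.1])
g_old, g_new = A @ x_old, A @ x_new
d_old = -g_old  # first step: steepest descent
print(hs_direction(g_new, g_old, d_old))
print(memoryless_bfgs_direction(g_new, x_new - x_old, g_new - g_old))
```

Shanno's observation (see "Conjugate Gradient Methods with Inexact Searches" in the cited works below) that conjugate gradient methods behave as memoryless quasi-Newton methods is what makes closeness between these two directions a natural design target for a modified HS formula.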
Related Items (10)
- A three-term CGPM-based algorithm without Lipschitz continuity for constrained nonlinear monotone equations with applications
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A hybrid quasi-Newton method with application in sparse recovery
- A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery
- A hybrid three-term conjugate gradient projection method for constrained nonlinear monotone equations with applications
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Convergence Properties of Algorithms for Nonlinear Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- CUTE: constrained and unconstrained testing environment
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
This page was built for publication: A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method