A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
From MaRDI portal
Publication:1634798
DOI: 10.1007/s11590-017-1205-y · zbMath: 1412.90165 · OpenAlex: W2761098424 · MaRDI QID: Q1634798
Publication date: 18 December 2018
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-017-1205-y
Keywords: global convergence; nonlinear conjugate gradient method; sufficient descent property; memoryless BFGS method
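The keywords above describe three-term conjugate gradient directions with the sufficient descent property. As a generic illustration only (not the specific family proposed in this paper), the sketch below implements a Hestenes–Stiefel-type three-term direction d = -g + β·d_prev - θ·y, for which the inner product dᵀg equals -‖g‖² by construction, paired with a simple Armijo backtracking line search; the objective (Rosenbrock) and all parameter values are illustrative assumptions.

```python
import numpy as np

def rosenbrock(x):
    # 2-D Rosenbrock test function: f(x) = (1 - x0)^2 + 100 (x1 - x0^2)^2
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad_rosenbrock(x):
    # analytic gradient of the Rosenbrock function
    g0 = -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2)
    g1 = 200 * (x[1] - x[0]**2)
    return np.array([g0, g1])

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=20000):
    # Generic three-term CG sketch (Hestenes-Stiefel-type coefficients).
    # NOT the paper's method; the third term makes d satisfy d.g = -||g||^2,
    # i.e. the sufficient descent property, independent of the line search.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, c1 = 1.0, 1e-4
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) > 1e-12:
            beta = (g_new @ y) / dy         # Hestenes-Stiefel-type beta
            theta = (g_new @ d) / dy        # coefficient of the third term
            d = -g_new + beta * d - theta * y   # three-term direction
        else:
            d = -g_new                      # restart with steepest descent
        x, g = x_new, g_new
    return x

x_star = three_term_cg(rosenbrock, grad_rosenbrock, [-1.2, 1.0])
```

Note that expanding g_newᵀd for the three-term update gives -‖g_new‖² + β(g_newᵀd_prev) - θ(g_newᵀy) = -‖g_new‖², since the β and θ terms cancel; this cancellation is what makes such three-term directions descent directions regardless of the line search used.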
Related Items (5)
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
- Two diagonal conjugate gradient like methods for unconstrained optimization
- Unnamed Item
- A hybrid three-term conjugate gradient projection method for constrained nonlinear monotone equations with applications
- Least-squares-based three-term conjugate gradient methods
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A variant smoothing Newton method for \(P_0\)-\(NCP\) based on a new smoothing function
- New quasi-Newton equation and related methods for unconstrained optimization
- Global pointwise error estimates for uniformly convergent finite element methods for the elliptic boundary layer problem
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Convergence Properties of Algorithms for Nonlinear Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles