A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
DOI: 10.1007/s00009-018-1132-x · zbMath: 1402.90211 · OpenAlex: W2799418939 · MaRDI QID: Q723782
Reza Ghanbari, Saman Babaie-Kafaki
Publication date: 24 July 2018
Published in: Mediterranean Journal of Mathematics
Full work available at URL: https://doi.org/10.1007/s00009-018-1132-x
Keywords: unconstrained optimization; global convergence; conjugate gradient method; nonlinear programming; memoryless BFGS method
MSC: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Numerical computation of matrix norms, conditioning, scaling (65F35)
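The record itself does not spell out the hybrid scheme, but the title and keywords indicate a search direction built from the Hestenes-Stiefel (HS) conjugate gradient direction and the memoryless BFGS (Shanno) direction. The sketch below is only an illustration of such a linear hybridization, not the authors' exact formula: it blends the two directions with a weight theta that is assumed here to be a user-supplied constant, whereas the paper derives its own choice of the hybridization parameter.

```python
# Illustrative sketch only (not the paper's exact scheme): a linear
# hybridization d = (1 - theta) * d_HS + theta * d_MLBFGS of the
# Hestenes-Stiefel direction and the memoryless BFGS (Shanno) direction.
import numpy as np

def hybrid_direction(g_new, d_prev, s, y, theta=0.5):
    """g_new: current gradient; d_prev: previous search direction;
    s = x_new - x_old; y = g_new - g_old; theta: hybridization weight in [0, 1]
    (a fixed constant here; the paper derives its own parameter choice)."""
    sy = s @ y  # curvature s_k^T y_k, positive under a Wolfe line search
    # Hestenes-Stiefel conjugate gradient direction
    beta_hs = (g_new @ y) / (d_prev @ y)
    d_hs = -g_new + beta_hs * d_prev
    # Memoryless BFGS direction: -H_{k+1} g_{k+1}, where H_{k+1} is the
    # BFGS update of the identity matrix with the pair (s, y)
    gs, gy = g_new @ s, g_new @ y
    d_mlbfgs = (-g_new
                + (gy * s + gs * y) / sy
                - (1.0 + (y @ y) / sy) * (gs / sy) * s)
    # Linear (convex) combination of the two candidate directions
    return (1.0 - theta) * d_hs + theta * d_mlbfgs
```

With one-dimensional NumPy arrays g_new, d_prev, s, and y taken from a single iteration, the function returns a vector blending the two updates; how the weight should actually be chosen is the subject of the paper.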
Cites Work
- A modified scaling parameter for the memoryless BFGS updating formula
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Two modified three-term conjugate gradient methods with sufficient descent property
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- On the sufficient descent property of the Shanno's conjugate gradient method
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Note on Performance Profiles for Benchmarking Software
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- CUTEr and SifDec
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles
- A survey of quasi-Newton equations and quasi-Newton methods for optimization