A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
DOI: 10.1080/10556788.2014.966825 · zbMath: 1328.65133 · OpenAlex: W2065678922 · Wikidata: Q57952636 · Scholia: Q57952636 · MaRDI QID: Q3458811
Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 28 December 2015
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2014.966825
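The paper combines the Hestenes–Stiefel and Dai–Yuan update parameters in a nonlinear conjugate gradient method. Below is a minimal Python sketch, assuming the hybrid parameter is a convex combination beta_k = (1 - theta_k) * beta_HS + theta_k * beta_DY, with beta_HS = (g_{k+1}^T y_k) / (d_k^T y_k) and beta_DY = ||g_{k+1}||^2 / (d_k^T y_k), where y_k = g_{k+1} - g_k. The fixed weight theta used here is a placeholder stand-in, not the least-squares rule derived in the paper, and the function name hybrid_hs_dy_cg is illustrative.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def hybrid_hs_dy_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    # Nonlinear CG whose beta_k is a convex combination of the
    # Hestenes-Stiefel and Dai-Yuan formulas:
    #   beta_k = (1 - theta) * beta_HS + theta * beta_DY.
    # The paper chooses theta per iteration by a least-squares
    # criterion; the constant theta below is only an assumption.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # Wolfe line search
        if alpha is None:                      # search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y                          # shared denominator
        if abs(denom) < 1e-12:
            beta = 0.0                         # steepest-descent restart
        else:
            beta_hs = (g_new @ y) / denom      # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / denom  # Dai-Yuan
            theta = 0.5                        # placeholder weight
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

x_star = hybrid_hs_dy_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
print(x_star)  # should approach [1., 1.]
```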
Related Items (10)
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- Least-squares-based three-term conjugate gradient methods
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
Cites Work
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A modified BFGS algorithm based on a hybrid secant equation
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- CUTEr and SifDec
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization