Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
Publication: 5495572
DOI: 10.1080/02331934.2012.693083
zbMATH: 1297.65065
Wikidata: Q57952810 (Scholia: Q57952810)
MaRDI QID: Q5495572
Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 5 August 2014
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2012.693083
Keywords: unconstrained optimization; global convergence; large-scale optimization; conjugate gradient algorithm; numerical results; secant equation; Hestenes-Stiefel; Dai-Yuan; Hager-Zhang
65K05: Numerical mathematical programming methods
90C25: Convex programming
90C06: Large-scale problems in mathematical programming
90C53: Methods of quasi-Newton type
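For orientation only: the keywords above name the Hestenes-Stiefel and Dai-Yuan conjugate gradient parameters that the paper hybridizes. The following is a standard textbook statement of the nonlinear conjugate gradient iteration with these two choices of the parameter; it is a generic sketch, not the modified secant-equation hybrid developed in the paper itself.

```latex
% Generic nonlinear conjugate gradient iteration (textbook form).
% x_k: current iterate, g_k = \nabla f(x_k), \alpha_k: step length from a line search.
\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad
  d_0 = -g_0, \qquad
  d_{k+1} = -g_{k+1} + \beta_k d_k .
\]
% Hestenes-Stiefel and Dai-Yuan choices of the conjugate gradient parameter,
% with y_k = g_{k+1} - g_k.
\[
  \beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
  \beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^{2}}{d_k^{\top} y_k}.
\]
```

Hybrid methods of this family typically combine two such parameters, for instance through a convex combination of the form \(\beta_k = (1-\theta_k)\,\beta_k^{\mathrm{HS}} + \theta_k\,\beta_k^{\mathrm{DY}}\); the particular combination and the modified secant equation used here are given in the full text linked above.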
Related Items
- Modified matrix-free methods for solving system of nonlinear equations
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
Cites Work
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A class of nonmonotone conjugate gradient methods for nonconvex functions
- A modified BFGS algorithm based on a hybrid secant equation
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Efficient hybrid conjugate gradient techniques
- A class of nonmonotone conjugate gradient methods for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence result for conjugate gradient methods
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- A note on global convergence result for conjugate gradient methods
- A Modified BFGS Algorithm for Unconstrained Optimization
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Restart procedures for the conjugate gradient method
- Conjugate Gradient Methods with Inexact Searches
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- CUTEr and SifDec
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- New properties of a nonlinear conjugate gradient method
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A survey of quasi-Newton equations and quasi-Newton methods for optimization