Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
DOI: 10.1080/02331934.2012.693083 · zbMATH Open: 1297.65065 · OpenAlex: W2076997099 · Wikidata: Q57952810 · Scholia: Q57952810 · MaRDI QID: Q5495572 · FDO: Q5495572
Authors: Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 5 August 2014
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2012.693083
Recommendations
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A hybrid nonlinear conjugate gradient method
- Two new conjugate gradient methods based on modified secant equations
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
Keywords: global convergence; numerical result; unconstrained optimization; conjugate gradient algorithm; secant equation; large-scale optimization; Hestenes-Stiefel; Dai-Yuan; Hager-Zhang
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Methods of quasi-Newton type (90C53)
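As context for the keywords above: a nonlinear conjugate gradient method updates the search direction as d_{k+1} = -g_{k+1} + β_k d_k, and hybrid methods combine two classical β formulas. The sketch below is purely illustrative and is not the authors' method from this paper — it applies a well-known Hestenes-Stiefel/Dai-Yuan hybridization, β_k = max(0, min(β_HS, β_DY)), to a small hand-picked quadratic (the matrix A and vector b are assumptions for demonstration only).

```python
import numpy as np

# Illustrative quadratic objective f(x) = 0.5 x^T A x - b^T x
# (A, b chosen arbitrarily for the demo; not taken from the paper)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])

def grad(x):
    return A @ x - b

x = np.zeros(2)
g = grad(x)
d = -g
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:
        break
    # Exact line search is available in closed form for a quadratic
    alpha = -(g @ d) / (d @ A @ d)
    x_new = x + alpha * d
    g_new = grad(x_new)
    y = g_new - g                      # gradient difference y_k
    denom = d @ y
    # Hybrid beta: Hestenes-Stiefel clipped into [0, beta_DY]
    beta_hs = (g_new @ y) / denom
    beta_dy = (g_new @ g_new) / denom
    beta = max(0.0, min(beta_hs, beta_dy))
    d = -g_new + beta * d
    x, g = x_new, g_new

# At convergence x should satisfy the optimality condition A x = b
```

For this 2-variable quadratic with exact line searches the iteration terminates in at most two steps; for general nonlinear functions one would replace the closed-form `alpha` with a Wolfe-condition line search.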
Cites Work
- Algorithm 851
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- CUTEr and SifDec
- Benchmarking optimization software with performance profiles
- On the limited memory BFGS method for large scale optimization
- Function minimization by conjugate gradients
- Technical Note—A Modified Conjugate Gradient Algorithm
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Convergence properties of the Fletcher-Reeves method
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A survey of quasi-Newton equations and quasi-Newton methods for optimization
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- A modified BFGS algorithm based on a hybrid secant equation
- Two new conjugate gradient methods based on modified secant equations
- Conjugate Gradient Methods with Inexact Searches
- New properties of a nonlinear conjugate gradient method
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A class of nonmonotone conjugate gradient methods for unconstrained optimization
- A class of nonmonotone conjugate gradient methods for nonconvex functions
- A note on global convergence result for conjugate gradient methods
Cited In (9)
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- A modified BFGS algorithm based on a hybrid secant equation
- Two new conjugate gradient methods based on modified secant equations
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter
- Modified matrix-free methods for solving system of nonlinear equations
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- Hybrid modification of accelerated double direction method