Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
From MaRDI portal
DOI: 10.1007/s11075-011-9457-6 · zbMath: 1277.90149 · MaRDI QID: Q645035
Nezam Mahdavi-Amiri, Masoud Fatemi, Saman Babaie-Kafaki
Publication date: 8 November 2011
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-011-9457-6
Keywords: unconstrained optimization; global convergence; hybrid conjugate gradient algorithm; modified BFGS method
90C30: Nonlinear programming
65K10: Numerical optimization and variational techniques
90C52: Methods of reduced gradient type
Related Items
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- Two derivative-free projection approaches for systems of large-scale nonlinear monotone equations
- Two modified scaled nonlinear conjugate gradient methods
- A limited memory descent Perry conjugate gradient method
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A modified Perry conjugate gradient method and its global convergence
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- Hybrid conjugate gradient algorithm for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence result for conjugate gradient methods
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- A spectral conjugate gradient method for unconstrained optimization
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization