A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
DOI: 10.3934/jimo.2016038
zbMath: 1365.65159
OpenAlex: W2408295463
MaRDI QID: Q2628174
Saman Babaie-Kafaki, Reza Ghanbari
Publication date: 12 June 2017
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2016038
Keywords: unconstrained optimization; global convergence; large-scale optimization; conjugate gradient method; BFGS update; sufficient descent property
MSC classification: Numerical mathematical programming methods (65K05); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
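For orientation, the Dai–Liao method that this paper extends generates search directions of the form

\[
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k, \qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k} - t\,\frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \qquad t > 0,
\]

where \(s_k = x_{k+1} - x_k\) and \(y_k = g_{k+1} - g_k\); the parameter \(t\) enforces the extended conjugacy condition \(d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k\). This is the standard Dai–Liao formula from the literature, stated here only as background; the four-term descent extension and its scaled memoryless BFGS derivation are defined in the paper itself.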
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Two modified three-term conjugate gradient methods with sufficient descent property
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms. Part I
- Self-Scaling Variable Metric (SSVM) Algorithms. Part II
- Optimal conditioning of self-scaling variable metric algorithms
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles