A scaled three-term conjugate gradient method for unconstrained optimization
DOI: 10.1186/s13660-016-1239-1 · zbMath: 1355.65074 · OpenAlex: W2561591096 · Wikidata: Q59459386 · Scholia: Q59459386 · MaRDI QID: Q2374196
Ibrahim Arzuka, Mohd R. Abu Bakar, Wah June Leong
Publication date: 14 December 2016
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-016-1239-1
Keywords: unconstrained optimization; quasi-Newton methods; nonlinear conjugate gradient method; inverse Hessian approximation; numerical result; DFP update
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
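For orientation only, the sketch below illustrates the method class named in the keywords: a generic descent three-term conjugate gradient iteration in the spirit of the descent three-term PRP methods listed under Cites Work. The function name `three_term_cg`, the Armijo backtracking step, and the particular coefficient choices are illustrative assumptions; they do not reproduce the scaled, memoryless DFP-based variant proposed in the publication itself.

```python
# Minimal generic three-term conjugate gradient sketch (illustrative only;
# NOT the scaled DFP-based method of the publication recorded on this page).
import numpy as np

def three_term_cg(f, grad, x0, max_iter=5000, tol=1e-6):
    """Minimize a smooth function f via the descent three-term direction
        d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    where y_k = g_{k+1} - g_k, beta_k = g_{k+1}^T y_k / ||g_k||^2 and
    theta_k = g_{k+1}^T d_k / ||g_k||^2, which gives
    g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 independently of the line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (a simple stand-in; convergence
        # analyses of such methods usually assume Wolfe-type conditions).
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g.dot(g)                     # safely bounded away from 0 by the tol check
        beta = g_new.dot(y) / gg
        theta = g_new.dot(d) / gg
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the standard starting point.
rosen = lambda z: (1.0 - z[0]) ** 2 + 100.0 * (z[1] - z[0] ** 2) ** 2
rosen_grad = lambda z: np.array([
    -2.0 * (1.0 - z[0]) - 400.0 * z[0] * (z[1] - z[0] ** 2),
    200.0 * (z[1] - z[0] ** 2),
])
print(three_term_cg(rosen, rosen_grad, [-1.2, 1.0]))
```

The third term is chosen so that the directional derivative g_{k+1}^T d_{k+1} equals -||g_{k+1}||^2, guaranteeing a sufficient descent direction at every iteration regardless of the line search accuracy.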
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- Convergence and stability of line search methods for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Variable Metric Method for Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- Measures for Symmetric Rank-One Updates
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- Quasi-Newton Methods and their Application to Function Minimisation
- Extension of Davidon’s Variable Metric Method to Maximization Under Linear Inequality and Equality Constraints
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.