A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization
DOI: 10.1080/02331934.2014.994625 · zbMath: 1335.65054 · OpenAlex: W2020974418 · MaRDI QID: Q2790872
Wah June Leong, Aliyu Usman Moyi
Publication date: 8 March 2016
Published in: Optimization
Full work available at URL: http://psasir.upm.edu.my/id/eprint/53933/1/Sufficient%20descent%20three%20term%20conjugate%20gradient%20method%20.pdf
Keywords: algorithm; unconstrained optimization; convergence; three-term conjugate gradient method; numerical result; large-scale; symmetric rank-one update; sufficient descent; memoryless update
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
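The record concerns a three-term conjugate gradient direction with a built-in sufficient-descent property. As an illustration only (not the authors' SR1-based formula), the sketch below implements a generic descent three-term PRP update in the style of the cited modified Polak–Ribière–Polyak work, for which d_{k+1}ᵀg_{k+1} = −‖g_{k+1}‖² holds by construction; the Armijo line-search parameters and function names are assumptions of this sketch.

```python
import numpy as np

def three_term_prp_direction(g_new, g_old, d_old):
    """Three-term PRP direction (illustrative, not the paper's SR1 variant):
    d+ = -g+ + beta*d - theta*y, which yields d+^T g+ = -||g+||^2 exactly."""
    y = g_new - g_old
    denom = g_old @ g_old                      # ||g_k||^2
    beta = (g_new @ y) / denom                 # PRP parameter
    theta = (g_new @ d_old) / denom            # third-term coefficient
    return -g_new + beta * d_old - theta * y

def minimize(f, grad, x0, tol=1e-6, max_iter=500):
    """Three-term CG loop with a backtracking Armijo line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                                     # steepest descent start
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: safe because g @ d = -||g||^2 < 0
        t, fx, gTd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gTd:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        d = three_term_prp_direction(g_new, g, d)
        x, g = x_new, g_new
    return x

# usage: minimize a strongly convex quadratic 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = minimize(f, grad, np.zeros(2))        # solves A x = b
```

The third term cancels the classical PRP term's contribution to gᵀd, so sufficient descent holds without any line-search condition, which is the structural point shared with the memoryless SR1 approach of the paper.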
Related Items (6)
Cites Work
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- A restarting approach for the symmetric rank one update for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A conjugate direction algorithm without line searches
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- Structured symmetric rank-one method for unconstrained optimization
- A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Measures for Symmetric Rank-One Updates
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- A Two-Term PRP-Based Descent Method
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles