A modified conjugate gradient method based on the self-scaling memoryless BFGS update
From MaRDI portal
Publication:2672717
DOI: 10.1007/s11075-021-01220-8
zbMath: 1492.65168
OpenAlex: W3210179682
Wikidata: Q115146109
Scholia: Q115146109
MaRDI QID: Q2672717
Publication date: 13 June 2022
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-021-01220-8
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Methods of quasi-Newton type (90C53)
Related Items (1)
Uses Software
Cites Work
- On the limited memory BFGS method for large scale optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A double parameter scaled BFGS method for unconstrained optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- The global convergence of a modified BFGS method for nonconvex functions
- Scaled conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Technical Note—A Modified Conjugate Gradient Algorithm
- Testing Unconstrained Optimization Software
- Updating Quasi-Newton Matrices with Limited Storage
- Conjugate Gradient Methods with Inexact Searches
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Benchmarking optimization software with performance profiles.