The convergence rate of a three-term HS method with restart strategy for unconstrained optimization problems
From MaRDI portal
Publication:5495597
Recommendations
- A note on the convergence properties of the original three-term Hestenes-Stiefel method
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- Global convergence of two kinds of three-term conjugate gradient methods without line search
- Scientific article (zbMATH DE number 1371602)
- An efficient adaptive three-term extension of the Hestenes-Stiefel conjugate gradient method
Cites work
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- Benchmarking optimization software with performance profiles
- Convergence properties of the Beale-Powell restart algorithm
- Die Konvergenzordnung des Fletcher-Powell-Algorithmus [The order of convergence of the Fletcher-Powell algorithm]
- Linear Convergence of the Conjugate Gradient Method
- Methods of conjugate gradients for solving linear systems
- On restart procedures for the conjugate gradient method
- On the rate of superlinear convergence of a class of variable metric methods
- Rate of Convergence of Several Conjugate Gradient Algorithms
- Restart procedures for the conjugate gradient method
- Some convergence properties of the conjugate gradient method
- Some descent three-term conjugate gradient methods and their global convergence
- The convergence rate of a restart MFR conjugate gradient method with inexact line search
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy