Convergence properties of the Beale-Powell restart algorithm
From MaRDI portal
Publication:1286637
Cites Work
- scientific article, zbMATH DE number 3843083 (no title available)
- scientific article, zbMATH DE number 3526471 (no title available)
- scientific article, zbMATH DE number 3439906 (no title available)
- scientific article, zbMATH DE number 3278849 (no title available)
- Alternative proofs of the convergence properties of the conjugate-gradient method
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Convergence properties of the Fletcher-Reeves method
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Line search algorithms with guaranteed sufficient decrease
- Linear Convergence of the Conjugate Gradient Method
- Rate of Convergence of Several Conjugate Gradient Algorithms
- Restart procedures for the conjugate gradient method
- The conjugate gradient method in extremal problems
Cited In (16)
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Nonlinear conjugate gradient for smooth convex functions
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Convergence of descent method without line search
- A new super-memory gradient method with curve search rule
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- Rate of convergence of a restarted CG-DESCENT method
- An efficient mixed conjugate gradient method for solving unconstrained optimisation problems
- The convergence rate of a three-term HS method with restart strategy for unconstrained optimization problems
- Globally linearly convergent nonlinear conjugate gradients without Wolfe line search
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Adaptive trust-region method on Riemannian manifold
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- The convergence rate of a restart MFR conjugate gradient method with inexact line search
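For context, the restart criterion studied in this publication and many of the works above can be sketched in a few lines. The sketch below is an illustrative simplification, not the paper's exact method: it uses the Polak-Ribière-Polyak beta and restarts with the steepest-descent direction whenever Powell's test detects loss of conjugacy, whereas the Beale-Powell algorithm restarts along Beale's three-term direction. The function names and the line-search constants are illustrative choices.

```python
import numpy as np

def cg_powell_restart(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with Powell's restart test.

    Restart (here simplified to steepest descent) is triggered when
    successive gradients are far from orthogonal:
        |g_{k+1} . g_k| >= 0.2 * ||g_{k+1}||^2
    Otherwise the search direction follows the PRP update.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along d.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:  # guard against a stalled search
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Powell's restart test: discard stale direction information.
        if abs(g_new.dot(g)) >= 0.2 * g_new.dot(g_new):
            d = -g_new
        else:
            beta = g_new.dot(g_new - g) / g.dot(g)  # PRP formula
            d = -g_new + max(beta, 0.0) * d
        x, g = x_new, g_new
    return x

# Usage: minimize an ill-conditioned quadratic from the origin.
x_star = cg_powell_restart(
    lambda v: (v[0] - 1.0) ** 2 + 10.0 * (v[1] + 2.0) ** 2,
    lambda v: np.array([2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)]),
    np.array([0.0, 0.0]),
)
```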
This page was built for publication: Convergence properties of the Beale-Powell restart algorithm