Convergence properties of the Beale-Powell restart algorithm
Publication: 1286637
DOI: 10.1007/BF02871976
zbMATH Open: 0919.65042
OpenAlex: W2028736557
MaRDI QID: Q1286637
Publication date: 23 August 1999
Published in: Science in China. Series A
Full work available at URL: https://doi.org/10.1007/bf02871976
Keywords: conjugate gradient; global convergence; unconstrained optimization; large-scale unconstrained optimization; line search; Beale-Powell restart algorithm
Cites Work
- Function minimization by conjugate gradients
- Line search algorithms with guaranteed sufficient decrease
- Title not available
- Title not available
- Restart procedures for the conjugate gradient method
- Title not available
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Convergence properties of the Fletcher-Reeves method
- Rate of Convergence of Several Conjugate Gradient Algorithms
- Linear Convergence of the Conjugate Gradient Method
- Alternative proofs of the convergence properties of the conjugate-gradient method
Cited In (16)
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- An Efficient Mixed Conjugate Gradient Method for Solving Unconstrained Optimisation Problems
- Nonlinear conjugate gradient for smooth convex functions
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Convergence of descent method without line search
- A new super-memory gradient method with curve search rule
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- The convergence rate of a three-term HS method with restart strategy for unconstrained optimization problems
- Rate of Convergence of a Restarted CG-DESCENT Method
- Globally linearly convergent nonlinear conjugate gradients without Wolfe line search
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Adaptive trust-region method on Riemannian manifold
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- The convergence rate of a restart MFR conjugate gradient method with inexact line search