Rate of Convergence of a Restarted CG-DESCENT Method
Publication: 2841919
DOI: 10.1080/01630563.2012.760590 · zbMath: 1282.90184 · OpenAlex: W2093484431 · MaRDI QID: Q2841919
Publication date: 30 July 2013
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630563.2012.760590
MSC classification: Numerical mathematical programming methods (65K05) · Nonlinear programming (90C30) · Numerical methods based on nonlinear programming (49M37)
Cites Work
- The convergence rate of a restart MFR conjugate gradient method with inexact line search
- On restart procedures for the conjugate gradient method
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- On the rate of superlinear convergence of a class of variable metric methods
- Convergence properties of the Beale-Powell restart algorithm
- Algorithm 851
- Some convergence properties of the conjugate gradient method
- Restart procedures for the conjugate gradient method
- Rate of Convergence of Several Conjugate Gradient Algorithms
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Linear Convergence of the Conjugate Gradient Method
- Benchmarking optimization software with performance profiles.