Convergence of conjugate gradient methods with constant stepsizes
From MaRDI portal
Publication: 3096886
DOI: 10.1080/10556781003721042
zbMath: 1227.49040
MaRDI QID: Q3096886
Publication date: 15 November 2011
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556781003721042
Keywords: unconstrained optimization; global convergence; conjugate gradient method; nonconvex; descent property; method of shortest residuals
MSC classification
65K05: Numerical mathematical programming methods
90C30: Nonlinear programming
49M37: Numerical methods based on nonlinear programming
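The record's keywords point at nonlinear conjugate gradient schemes run with a constant stepsize rather than an exact or inexact line search. As a rough illustration of that idea (a minimal sketch for a toy problem, not the algorithm analyzed in the paper), the following applies Fletcher-Reeves CG with a fixed stepsize and periodic restarts to a strongly convex quadratic; the objective, stepsize, and restart interval are all hypothetical choices:

```python
# Illustration only: Fletcher-Reeves nonlinear CG with a CONSTANT stepsize
# (no line search), applied to the toy quadratic f(x, y) = 0.5*x^2 + y^2.
# A periodic restart resets the direction to steepest descent.

def f(p):
    x, y = p
    return 0.5 * x * x + y * y

def grad(p):
    x, y = p
    return (x, 2.0 * y)  # gradient of 0.5*x^2 + y^2

def cg_constant_step(p0, alpha=0.05, iters=500, restart=2):
    p = list(p0)
    g = grad(p)
    d = [-g[0], -g[1]]  # initial direction: steepest descent
    for k in range(iters):
        # constant stepsize update, x_{k+1} = x_k + alpha * d_k
        p = [p[0] + alpha * d[0], p[1] + alpha * d[1]]
        g_new = grad(p)
        if (k + 1) % restart == 0:
            d = [-g_new[0], -g_new[1]]  # periodic restart
        else:
            # Fletcher-Reeves coefficient: beta_k = ||g_{k+1}||^2 / ||g_k||^2
            beta = (g_new[0]**2 + g_new[1]**2) / (g[0]**2 + g[1]**2)
            d = [-g_new[0] + beta * d[0], -g_new[1] + beta * d[1]]
        g = g_new
    return p

p_final = cg_constant_step([3.0, 2.0])
print(f(p_final))  # close to the minimum value 0
```

Whether such a scheme converges in general, and under what conditions on the stepsize and the objective, is exactly the kind of question the cited paper addresses; the sketch above is only meant to show the mechanics of the iteration.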
Related Items
- Method of conjugate subgradients with constrained memory
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- Computational efficiency of the simplex embedding method in convex nondifferentiable optimization
Cites Work
- Global convergence of the method of shortest residuals
- On the method of shortest residuals for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- On the convergence of conjugate gradient algorithms
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence properties of the Fletcher-Reeves method
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Global convergence of conjugate gradient methods without line search