Global convergence of a two-parameter family of conjugate gradient methods without line search
Publication: 697544
DOI: 10.1016/S0377-0427(02)00416-8
zbMath: 1018.65081
MaRDI QID: Q697544
Publication date: 17 September 2002
Published in: Journal of Computational and Applied Mathematics
Keywords: unconstrained optimization; global convergence; numerical experiments; conjugate gradient method; line search; large-scale problems
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
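
The record itself contains no formulas, so the following is offered only as orientation: a minimal Python sketch of the general idea behind conjugate gradient methods "without line search", in which the one-dimensional search for the stepsize is replaced by a closed-form formula. The Sun-Zhang-type stepsize with Q_k = I, the Dai-Yuan choice of beta, the parameter delta, and the quadratic test problem are all illustrative assumptions; they are not the two-parameter family analysed in the paper.

import numpy as np

def cg_no_line_search(f_grad, x0, delta=0.2, max_iter=5000, tol=1e-8):
    # Illustrative sketch only (not the paper's scheme): nonlinear CG in which
    # the stepsize alpha_k = delta * (-g_k^T d_k) / ||d_k||^2 is computed in
    # closed form instead of by a line search; delta is assumed small relative
    # to the Lipschitz constant of the gradient.
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = delta * (-(g @ d)) / (d @ d)  # closed-form stepsize, Q_k = I
        x = x + alpha * d
        g_new = f_grad(x)
        # Dai-Yuan beta as one representative member of a CG family.
        beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d
        g = g_new
    return x

# Example: a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, whose
# gradient is A x - b; for a sufficiently small delta the iterates should
# approach the solution of A x = b (about [0.2, 0.4] here).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = cg_no_line_search(lambda x: A @ x - b, x0=np.zeros(2))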
Related Items
- An ODE-based trust region method for unconstrained optimization problems
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Unnamed Item
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- Convergence of Liu-Storey conjugate gradient method
- GLOBAL CONVERGENCE OF A SPECIAL CASE OF THE DAI–YUAN FAMILY WITHOUT LINE SEARCH
- Convergence of conjugate gradient methods with a closed-form stepsize formula
- Some remarks on conjugate gradient methods without line search
- Global convergence of a memory gradient method without line search
- CONVERGENCE PROPERTY AND MODIFICATIONS OF A MEMORY GRADIENT METHOD
- GLOBAL CONVERGENCE OF SHORTEST-RESIDUAL FAMILY OF CONJUGATE GRADIENT METHODS WITHOUT LINE SEARCH
- GLOBAL CONVERGENCE OF TWO KINDS OF THREE-TERM CONJUGATE GRADIENT METHODS WITHOUT LINE SEARCH
- A new hybrid method for nonlinear complementarity problems
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A note about WYL's conjugate gradient method and its applications
- A nonmonotone conic trust region method based on line search for solving unconstrained optimization
- A new family of conjugate gradient methods
- New investigation for the Liu-Story scaled conjugate gradient method for nonlinear optimization
- Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions
- A modified PRP conjugate gradient method
- A descent algorithm without line search for unconstrained optimization
- Convergence of descent method without line search
- Adaptive trust-region algorithms for unconstrained optimization
- A nonmonotone trust region method for unconstrained optimization
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Efficient generalized conjugate gradient algorithms. I: Theory
- A three-parameter family of nonlinear conjugate gradient methods
- Testing Unconstrained Optimization Software
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Global convergence of conjugate gradient methods without line search