Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
From MaRDI portal
Publication: 3055065
DOI: 10.1090/S0025-5718-08-02031-0
zbMath: 1198.65091
Wikidata: Q59241589 (Scholia: Q59241589)
MaRDI QID: Q3055065
Liqun Qi, Guoyin Li, Zeng-xin Wei
Publication date: 7 November 2010
Published in: Mathematics of Computation
90C26: Nonconvex programming, global optimization
65H10: Numerical computation of solutions to systems of equations
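As an illustration of the method this record concerns, below is a minimal sketch of the Polak-Ribière-Polyak conjugate gradient iteration combined with a backtracking Armijo line search. This is not the paper's exact Armijo-type rule or its convergence safeguards; the parameter names (`sigma`, `rho`, `s`) and the steepest-descent restart are illustrative assumptions.

```python
import numpy as np

def prp_armijo(f, grad, x0, sigma=1e-4, rho=0.5, s=1.0, tol=1e-6, max_iter=500):
    """PRP conjugate gradient with a backtracking Armijo line search (sketch).

    Illustrative only; the paper studies a specific Armijo-type inexact rule
    that guarantees global convergence for nonconvex problems.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:  # safeguard: restart with steepest descent
            d = -g
        # Armijo backtracking: accept alpha with
        # f(x + alpha d) <= f(x) + sigma * alpha * g^T d
        alpha = s
        while f(x + alpha * d) > f(x) + sigma * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP formula: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = prp_armijo(f, grad, np.zeros(2))
```

The PRP update is attractive in practice because `beta` vanishes automatically when consecutive gradients are nearly equal, giving a built-in restart; the line-search condition is what the paper modifies to obtain global convergence without convexity.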
Related Items
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- A self-adjusting spectral conjugate gradient method for large-scale unconstrained optimization
- Global convergence of a nonlinear conjugate gradient method
- Global convergence of a modified spectral conjugate gradient method
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- A nonmonotone supermemory gradient algorithm for unconstrained optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A new version of the Liu-Storey conjugate gradient method
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
Uses Software
Cites Work
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Efficient generalized conjugate gradient algorithms. I: Theory
- A globally convergent version of the Polak-Ribière conjugate gradient method
- An SQP-type method and its application in stochastic programs
- Minimization of functions having Lipschitz continuous first partial derivatives
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A modification of Armijo's step-size rule for negative curvature
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonmonotone Line Search Technique for Newton’s Method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Global convergence of conjugate gradient methods without line search