A modified PRP conjugate gradient method
From MaRDI portal
Publication: 1026553
DOI: 10.1007/s10479-008-0420-4
zbMath: 1163.90798
OpenAlex: W1975335027
MaRDI QID: Q1026553
Publication date: 25 June 2009
Published in: Annals of Operations Research
Full work available at URL: https://doi.org/10.1007/s10479-008-0420-4
Keywords: unconstrained optimization; global convergence; conjugate gradient method; line search; R-linear convergence
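For context, the classical Polak-Ribière-Polyak (PRP) scheme that this paper modifies can be sketched as follows. This is an illustrative implementation of the *standard* PRP update with a simple Armijo backtracking line search, not the modified method of the paper itself; the function names and parameters are chosen here for illustration.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical PRP conjugate gradient method with Armijo backtracking.

    Illustrative sketch only; NOT the modified PRP scheme of the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP parameter: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new.dot(g_new - g) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = prp_cg(f, grad, np.zeros(2))
```

The PRP parameter can be negative, and the unmodified method is known not to guarantee descent or global convergence for general nonconvex functions under inexact line searches, which is the kind of gap modified PRP variants such as this paper's address.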
Related Items
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- A new adaptive trust region algorithm for optimization problems
- A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- An infeasible interior-point technique to generate the nondominated set for multiobjective optimization problems
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- An active set limited memory BFGS algorithm for bound constrained optimization
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A quasi-Newton algorithm for large-scale nonlinear equations
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A conjugate gradient method for unconstrained optimization problems
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- A New Method with Descent Property for Symmetric Nonlinear Equations
- A conjugate gradient method with descent direction for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
- A conjugate gradient algorithm and its applications in image restoration
- BFGS trust-region method for symmetric nonlinear equations
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for nonconvex functions
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
Uses Software
Cites Work
- Unnamed Item
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- Optimization. Algorithms and consistent approximations
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Convergence of line search methods for unconstrained optimization
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Minimization of functions having Lipschitz continuous first partial derivatives
- Convergence Properties of Algorithms for Nonlinear Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A modification of Armijo's step-size rule for negative curvature
- Restart procedures for the conjugate gradient method
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonmonotone Line Search Technique for Newton’s Method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Global convergence of conjugate gradient methods without line search