A modified PRP conjugate gradient method
From MaRDI portal
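For orientation, the Polak-Ribière-Polyak (PRP) scheme that this paper modifies updates the search direction as \(d_{k+1} = -g_{k+1} + \beta_k^{PRP} d_k\) with \(\beta_k^{PRP} = g_{k+1}^\top (g_{k+1} - g_k)/\|g_k\|^2\). The sketch below implements the classical PRP iteration with the common PRP+ nonnegativity clip and an Armijo backtracking line search; these are generic textbook choices, not the specific modification or line search analyzed in the paper itself.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the classical Polak-Ribiere-Polyak (PRP) beta,
    a PRP+ nonnegativity clip, and Armijo backtracking line search.
    A generic textbook sketch, not the paper's specific modification."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                   # safeguard: restart with
            d = -g                          # steepest descent if needed
        # Armijo backtracking line search
        t, c, rho, fx = 1.0, 1e-4, 0.5, f(x)
        while f(x + t * d) > fx + c * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP beta; the max(., 0) clip is the standard PRP+ safeguard,
        # which gives the method an automatic restart property
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = prp_cg(f, grad, np.zeros(2))   # minimizer is A^{-1} b = [1, 0.1]
```

Modified PRP methods, such as the one this page records, typically alter the direction update or the line search so that descent and global convergence hold without the restart safeguard used in this sketch.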
Recommendations
- scientific article; zbMATH DE number 5926152
- Global convergence of a modified PRP conjugate gradient method
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- Convergence of the modified PRP conjugate gradient method following a new line search
- Global convergence of a modified PRP conjugate gradient method and its numerical results
- A modified three-term PRP conjugate gradient algorithm for optimization models
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Another modified DPRP conjugate gradient method with global convergent properties
- A practical PR+ conjugate gradient method only using gradient
Cites work
- scientific article; zbMATH DE number 992790 (no title available)
- scientific article; zbMATH DE number 1004164 (no title available)
- scientific article; zbMATH DE number 3843083 (no title available)
- scientific article; zbMATH DE number 4141383 (no title available)
- scientific article; zbMATH DE number 88930 (no title available)
- scientific article; zbMATH DE number 3526471 (no title available)
- scientific article; zbMATH DE number 1276050 (no title available)
- scientific article; zbMATH DE number 2063453 (no title available)
- scientific article; zbMATH DE number 871316 (no title available)
- scientific article; zbMATH DE number 3278849 (no title available)
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonmonotone Line Search Technique for Newton’s Method
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A modification of Armijo's step-size rule for negative curvature
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles
- CUTEr and SifDec
- Convergence Properties of Algorithms for Nonlinear Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence of line search methods for unconstrained optimization
- Convergence of nonlinear conjugate gradient methods
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Global convergence of conjugate gradient methods without line search
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Methods of conjugate gradients for solving linear systems
- Minimization of functions having Lipschitz continuous first partial derivatives
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- Optimization. Algorithms and consistent approximations
- Restart procedures for the conjugate gradient method
- Testing Unconstrained Optimization Software
- The conjugate gradient method in extremal problems
- The convergence properties of some new conjugate gradient methods
- Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
Cited in (42)
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- A conjugate gradient method with descent direction for unconstrained optimization
- A trust region algorithm with conjugate gradient technique for optimization problems
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A new adaptive trust region algorithm for optimization problems
- Another modified version of RMIL conjugate gradient method
- BFGS trust-region method for symmetric nonlinear equations
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- An active set limited memory BFGS algorithm for bound constrained optimization
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- A conjugate gradient algorithm and its applications in image restoration
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- An infeasible interior-point technique to generate the nondominated set for multiobjective optimization problems
- A modified PRP projection method for nonlinear equations with convex constraints
- A modified PRP conjugate gradient method for unconstrained optimization and nonlinear equations
- A conjugate gradient method for unconstrained optimization problems
- A modified Hestenes-Stiefel conjugate gradient algorithm for large-scale optimization
- A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- A New Method with Descent Property for Symmetric Nonlinear Equations
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- A projected PRP method for optimization with convex constraint
- A modified DPRP conjugate gradient method for unconstrained optimization
- The Hager-Zhang conjugate gradient algorithm for large-scale nonlinear equations
- Global convergence of a modified PRP conjugate gradient method
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- The projection technique for two open problems of unconstrained optimization problems
- A quasi-Newton algorithm for large-scale nonlinear equations
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
This page was built for publication: A modified PRP conjugate gradient method
MaRDI item: Q1026553