A globally convergent version of the Polak-Ribière conjugate gradient method
From MaRDI portal
Publication: Q1366426
DOI: 10.1007/BF02614362
zbMATH Open: 0887.90157
MaRDI QID: Q1366426
Publication date: 10 September 1997
Published in: Mathematical Programming. Series A. Series B
Keywords: global convergence, line search, unconstrained minimization, nonconvex differentiable functions, Polak-Ribière conjugate gradient method
Numerical mathematical programming methods (65K05) Nonlinear programming (90C30) Methods of reduced gradient type (90C52)
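The record above concerns a globally convergent variant of the Polak-Ribière conjugate gradient method for unconstrained minimization. As an informal illustration only, the following sketch shows the standard nonlinear CG iteration with the Polak-Ribière coefficient truncated at zero (the "PR+" safeguard often associated with global convergence results). The function name `prp_cg` and the backtracking Armijo search are this sketch's own choices; the paper's actual line search conditions are more elaborate and are not reproduced here.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with the truncated Polak-Ribiere (PR+) update.

    Illustrative sketch only: the backtracking Armijo search below is a
    stand-in, not the specially designed line search of the cited paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (assumed stand-in).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere coefficient, truncated at zero (PR+).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        # Restart with steepest descent if d fails to be a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a convex quadratic f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = prp_cg(f, grad, np.zeros(2))
```

On this quadratic the iterate converges to the solution of A x = b; on nonconvex differentiable functions, establishing convergence of the unmodified PRP iteration is exactly the difficulty the cited paper and many of the works below address.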
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Function minimization by conjugate gradients
- On the Convergence of a New Conjugate Gradient Algorithm
- Methods of conjugate gradients for solving linear systems
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- Conjugate Gradient Methods with Inexact Searches
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- Stopping criteria for linesearch methods without derivatives
- Generalized Polak-Ribière algorithm
- Globally convergent conjugate gradient algorithms
Cited In (92)
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- Convergence of PRP method with new nonmonotone line search
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- Exploiting damped techniques for nonlinear conjugate gradient methods
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- Global convergence of a nonlinear conjugate gradient method
- A conjugate gradient method with descent direction for unconstrained optimization
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- The convergence of conjugate gradient method with nonmonotone line search
- A class of line search-type methods for nonsmooth convex regularized minimization
- The convergence properties of some new conjugate gradient methods
- A modified PRP conjugate gradient method
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- Globally convergent modified Perry's conjugate gradient method
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Convergence properties of the dependent PRP conjugate gradient methods
- A new class of supermemory gradient methods
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- Convergence of descent method without line search
- Convergence properties of a correlation Polak-Ribière conjugate gradient method
- A note about WYL's conjugate gradient method and its applications
- A PRP-based residual method for large-scale monotone nonlinear equations
- A gradient-related algorithm with inexact line searches
- A new super-memory gradient method with curve search rule
- A new family of conjugate gradient methods
- A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- Two new conjugate gradient methods based on modified secant equations
- Conjugate gradient methods with Armijo-type line searches.
- A new descent algorithm with curve search rule
- New step lengths in conjugate gradient methods
- A simple sufficient descent method for unconstrained optimization
- A spectral conjugate gradient method for nonlinear inverse problems
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Some sufficient descent conjugate gradient methods and their global convergence
- A practical PR+ conjugate gradient method only using gradient
- Convergence properties of a class of nonlinear conjugate gradient methods
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A new variant of the memory gradient method for unconstrained optimization
- A three-parameter family of nonlinear conjugate gradient methods
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Further studies on the Wei-Yao-Liu nonlinear conjugate gradient method
- A short note on the global convergence of the unmodified PRP method
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- Global convergence of a modified spectral conjugate gradient method
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- On memory gradient method with trust region for unconstrained optimization
- Convergence of Liu-Storey conjugate gradient method
- New spectral PRP conjugate gradient method for unconstrained optimization
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A modified three-term PRP conjugate gradient algorithm for optimization models
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for nonconvex functions
- Title not available
- A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems
- The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- A derivative-free conjugate residual method using secant condition for general large-scale nonlinear equations
- A spectral KRMI conjugate gradient method under the strong-Wolfe line search
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
- Memory gradient method with Goldstein line search
- A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- Convergence of supermemory gradient method
- A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- A note on the global convergence of the quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Global convergence of Polak-Ribière conjugate gradient method
- A sufficient descent nonlinear conjugate gradient method for solving \(\mathcal{M}\)-tensor equations
- Global convergence of a modified conjugate gradient method
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- The projection technique for two open problems of unconstrained optimization problems
- Globally convergent diagonal Polak-Ribière-Polyak like algorithm for nonlinear equations
- AADIS: an atomistic analyzer for dislocation character and distribution
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- Sufficient descent Polak-Ribière-Polyak conjugate gradient algorithm for large-scale box-constrained optimization
- A modified conjugacy condition and related nonlinear conjugate gradient method