Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
Publication: 5312756
DOI: 10.1080/1055678042000208570
zbMath: 1087.90086
OpenAlex: W2067448198
MaRDI QID: Q5312756
Publication date: 25 August 2005
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/1055678042000208570
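For context, the method studied in this publication is the Polak-Ribière (also called Polak-Ribière-Polyak, PRP) nonlinear conjugate gradient method. Below is a minimal illustrative sketch, not the paper's own implementation: it uses the standard PRP coefficient with the common nonnegativity safeguard (PRP+), a backtracking Armijo line search with illustrative parameter choices, and the Rosenbrock function as an assumed test problem.

```python
# Illustrative sketch of the Polak-Ribiere(-Polyak) conjugate gradient method
# with a backtracking Armijo line search. Assumes standard notation:
#   beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2  (clipped at zero, "PRP+").
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative parameters).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP coefficient, clipped at zero -- a common safeguard related to the
        # convergence questions this kind of analysis addresses.
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        # Restart with steepest descent if d is not a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example (hypothetical test problem): minimise the 2-D Rosenbrock function.
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
                                 200 * (z[1] - z[0]**2)])
print(prp_cg(rosen, rosen_grad, [-1.2, 1.0]))
```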
Related Items (12)
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- Strong global convergence of an adaptive nonmonotone memory gradient method
- Convergence of Liu-Storey conjugate gradient method
- Nonmonotone adaptive trust region method
- A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
- A modified conjugacy condition and related nonlinear conjugate gradient method
- Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
- Exploiting damped techniques for nonlinear conjugate gradient methods
- An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
- A novel method of dynamic force identification and its application
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
Cites Work
- Efficient generalized conjugate gradient algorithms. I: Theory
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- Generalized Polak-Ribière algorithm
- Conjugate gradient methods with Armijo-type line searches
- Convergence Properties of Algorithms for Nonlinear Optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Stopping criteria for linesearch methods without derivatives
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Globally convergent conjugate gradient algorithms
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Function minimization by conjugate gradients