A three-term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
From MaRDI portal
Publication:2338474
DOI: 10.3934/jimo.2018149 · zbMath: 1438.90326 · OpenAlex: W2900034036 · MaRDI QID: Q2338474
Publication date: 21 November 2019
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2018149
Keywords: global convergence · nonlinear conjugate gradient method · sufficient descent property · Polak-Ribière-Polyak method · memoryless BFGS method
Numerical mathematical programming methods (65K05) Nonlinear programming (90C30) Methods of quasi-Newton type (90C53)
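For orientation only: the keywords reference the Polak-Ribière-Polyak (PRP) update that the paper's three-term variant builds on. The sketch below is the classical two-term PRP iteration with exact line search on a strictly convex quadratic, not the paper's method; the test matrix and right-hand side are assumptions chosen for a self-contained demo.

```python
import numpy as np

def prp_cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive
    definite) via the classical PRP conjugate gradient iteration.
    With exact line search on a quadratic, PRP coincides with
    linear CG and terminates in at most n steps."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                        # gradient of the quadratic
    d = -g                               # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        # PRP parameter: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d            # two-term direction update
        g = g_new
    return x

# Demo data (hypothetical): minimizer is the solution of A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = prp_cg_quadratic(A, b, [0.0, 0.0])   # -> approx [0.2, 0.4]
```

A three-term scheme of the kind studied in the paper augments the direction update `d = -g_new + beta * d` with an additional correction term so that the direction stays close to a memoryless BFGS quasi-Newton direction; that refinement is not reproduced here.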
Related Items
- A three-term CGPM-based algorithm without Lipschitz continuity for constrained nonlinear monotone equations with applications
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
- A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery
- A modified inertial three-term conjugate gradient projection method for constrained nonlinear equations with applications in compressed sensing
- A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
Uses Software
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- The global convergence of a descent PRP conjugate gradient method
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of a modified BFGS method for nonconvex functions
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles