A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
Publication: 1720973
DOI: 10.1155/2018/4813030 · zbMath: 1427.90291 · OpenAlex: W2810515883 · MaRDI QID: Q1720973
Authors: Yong Li, Gaoyi Wu, Gong Lin Yuan
Publication date: 8 February 2019
Published in: Mathematical Problems in Engineering
Full work available at URL: https://doi.org/10.1155/2018/4813030
MSC classification: Numerical mathematical programming methods (65K05) · Nonconvex programming, global optimization (90C26) · Nonlinear programming (90C30) · Methods of reduced gradient type (90C52)
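For context, a generic three-term conjugate gradient iteration for minimizing a smooth function \(f\) takes the form below. This is a minimal illustrative sketch of the method class named in the title, not the specific direction formula, parameter choices, or line search proposed in this paper.
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\[2pt]
-g_k + \beta_k d_{k-1} + \theta_k y_{k-1}, & k \ge 1,
\end{cases}
\]
where \(g_k = \nabla f(x_k)\), \(y_{k-1} = g_k - g_{k-1}\), \(\alpha_k\) is a step length obtained from a line search, and the scalars \(\beta_k\), \(\theta_k\) distinguish individual three-term methods.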
Related Items (3)
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- A note about WYL's conjugate gradient method and its applications
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified PRP conjugate gradient method
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- On the rate of superlinear convergence of a class of variable metric methods
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- The global convergence of a modified BFGS method for nonconvex functions
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- Convergence Properties of Algorithms for Nonlinear Optimization
- Die Konvergenzordnung des Fletcher-Powell-Algorithmus [The order of convergence of the Fletcher-Powell algorithm]
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Generalized trapezoidal formulas for parabolic equations
- Rate of Convergence of Several Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles