Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
From MaRDI portal
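For context, the weak Wolfe-Powell conditions referred to in the title can be sketched as follows. This is the standard textbook form of the conditions (sufficient decrease plus a curvature bound), not the specific modification proposed in the publication; the function names and test problem are illustrative.

```python
import numpy as np

def satisfies_weak_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe-Powell conditions for step length alpha
    along a descent direction d, with 0 < c1 < c2 < 1.
    Generic sketch; not code from the cited publication."""
    gx_d = grad(x) @ d  # directional derivative at x, negative for descent
    # Sufficient-decrease (Armijo) condition
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * gx_d
    # Curvature condition: the slope along d must have increased enough
    curvature = grad(x + alpha * d) @ d >= c2 * gx_d
    return bool(armijo and curvature)

# Example on the quadratic f(x) = 0.5 ||x||^2 with a steepest-descent step
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
x0 = np.array([1.0, 2.0])
d = -grad(x0)
print(satisfies_weak_wolfe(f, grad, x0, d, alpha=1.0))  # exact step: True
```

A line-search routine would typically bracket and bisect on `alpha` until both conditions hold; conjugate gradient methods built on these conditions use the accepted step to update the iterate and the search direction.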
Recommendations
- A modified conjugacy condition and related nonlinear conjugate gradient method
- A new modification of nonlinear conjugate gradient formula
- A modified hybrid conjugate gradient method for unconstrained optimization
- A modified conjugate gradient method with sufficient condition and conjugacy condition
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
Cites work
- scientific article; zbMATH DE number 3526471
- scientific article; zbMATH DE number 3278849
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization
- A note about WYL's conjugate gradient method and its applications
- A three-parameter family of nonlinear conjugate gradient methods
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- An efficient modified Polak-Ribière-Polyak conjugate gradient method with global convergence properties
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- An unconstrained optimization test functions collection
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Benchmarking optimization software with performance profiles
- CUTE
- Conjugate gradient methods with Armijo-type line searches
- Descent property and global convergence of the Fletcher-Reeves method with inexact line search
- Efficient generalized conjugate gradient algorithms. I: Theory
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Methods of conjugate gradients for solving linear systems
- Practical methods of optimization
- The Limited Memory Conjugate Gradient Method
- The convergence properties of some new conjugate gradient methods
Cited in (5)
- A modification of classical conjugate gradient method using strong Wolfe line search
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Modifications of the Wolfe line search rules to satisfy second-order optimality conditions in unconstrained optimization
- An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
This page was built for publication: Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
(MaRDI item Q1667567)