Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
From MaRDI portal
Publication: Q499680
DOI: 10.1007/s11590-014-0836-5
zbMath: 1332.90344
OpenAlex: W2068555606
MaRDI QID: Q499680
Ximei Yang, Xiao Liang Dong, Yin Ling Xu, Hong-Wei Liu
Publication date: 6 October 2015
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-014-0836-5
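For context, the paper concerns nonlinear conjugate gradient methods whose search directions satisfy a sufficient descent condition and which converge globally. The sketch below is a generic illustration of that class of methods (a Hestenes-Stiefel beta with an Armijo backtracking line search and a steepest-descent restart as the descent safeguard); it is not the authors' specific algorithm, and all function and parameter names are this example's own.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Illustrative nonlinear CG sketch: Hestenes-Stiefel beta with a
    restart whenever the new direction fails a sufficient-descent check.
    NOT the specific methods proposed in the paper."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        # Hestenes-Stiefel beta, zeroed if the denominator degenerates
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d_new = -g_new + beta * d
        # Sufficient-descent safeguard: restart with steepest descent
        # if d_new is not a sufficiently good descent direction
        if d_new.dot(g_new) > -1e-8 * g_new.dot(g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On this quadratic the minimizer solves Ax = b, so the iterate should approach (0.2, 0.4).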
Related Items (10)
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- Two diagonal conjugate gradient like methods for unconstrained optimization
- Adaptive type-2 neural fuzzy sliding mode control of a class of nonlinear systems
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A diagonal PRP-type projection method for convex constrained nonlinear monotone equations
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A new three-term conjugate gradient method with descent direction for unconstrained optimization
Cites Work
- Nonlinear conjugate gradient methods with sufficient descent properties for unconstrained optimization
- Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.