A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems
Publication: 2987782
DOI: 10.1080/01630563.2016.1232730 · zbMath: 1362.90341 · OpenAlex: W2518810377 · MaRDI QID: Q2987782
Zexian Liu, Yu Bo He, Hong-Wei Liu, Xiao Liang Dong, Xiang-Li Li
Publication date: 18 May 2017
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630563.2016.1232730
MSC classifications: Large-scale problems in mathematical programming (90C06) · Nonlinear programming (90C30) · Methods of quasi-Newton type (90C53) · Methods of reduced gradient type (90C52)
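For context, the paper builds on the classical Hestenes–Stiefel conjugate gradient update, in which the search direction is d_{k+1} = -g_{k+1} + β_k d_k with β_k^{HS} = g_{k+1}^T y_k / (d_k^T y_k) and y_k = g_{k+1} - g_k. The sketch below implements only this classical HS iteration (not the paper's modified nonmonotone variant), paired with a simple Armijo backtracking line search on an illustrative convex quadratic; the function names and tolerances are assumptions for illustration.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=1000):
    """Classical Hestenes-Stiefel nonlinear CG with Armijo backtracking.

    Direction update: d_{k+1} = -g_{k+1} + beta_k d_k,
    beta_k^HS = g_{k+1}^T y_k / (d_k^T y_k), where y_k = g_{k+1} - g_k.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search: halve the step until
        # sufficient decrease f(x + t d) <= f(x) + c1 t g^T d holds.
        t, fx, gTd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gTd and t > 1e-16:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        # HS beta; restart with steepest descent if the denominator vanishes
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative strictly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hs_conjugate_gradient(f, grad, np.zeros(2))
```

On a symmetric positive definite quadratic, d^T y = t d^T A d > 0 at every step, so the HS denominator never degenerates; the nonmonotone line search studied in the paper relaxes the monotone Armijo decrease used here.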
Related Items (2)
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- Fully complex conjugate gradient-based neural networks using Wirtinger calculus framework: deterministic convergence and its application
Cites Work
- Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- Convergence analysis of a modified BFGS method on convex minimizations
- Efficient generalized conjugate gradient algorithms. I: Theory
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.