Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Publication: 2969958
DOI: 10.1051/ro/2016028
zbMath: 1358.49027
OpenAlex: W2335839371
MaRDI QID: Q2969958
Yu Bo He, Weijun Li, Xiao Liang Dong
Publication date: 24 March 2017
Published in: RAIRO - Operations Research
Full work available at URL: https://doi.org/10.1051/ro/2016028
Keywords: global convergence; conjugacy condition; sufficient descent condition; numerical comparison; Yabe-Takano conjugate gradient method
MSC classification: Numerical mathematical programming methods (65K05); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
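The record itself carries no formulas, but its keywords point at a standard framework: a nonlinear conjugate gradient iteration x_{k+1} = x_k + alpha_k d_k with search direction d_{k+1} = -g_{k+1} + beta_k d_k, where the sufficient descent condition requires g_k^T d_k <= -c ||g_k||^2 for some constant c > 0. The sketch below illustrates only that generic safeguard, using the Hestenes-Stiefel beta_k and a simple backtracking Armijo line search as assumed stand-ins; it is not the authors' modified Yabe-Takano update, whose beta_k is built from a modified secant (conjugacy) condition.

```python
import numpy as np

def cg_sufficient_descent(f, grad, x0, c=0.5, tol=1e-6, max_iter=2000):
    """Generic nonlinear CG with a sufficient descent safeguard.

    Uses the Hestenes-Stiefel beta; if the new direction fails the
    test g_k^T d_k <= -c * ||g_k||^2, it restarts with steepest
    descent.  Illustrative only -- NOT the paper's modified
    Yabe-Takano formulas.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # initial steepest descent step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a stand-in for the Wolfe
        # conditions usually assumed in the convergence analysis).
        alpha, slope = 1.0, g @ d
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0  # HS beta
        d = -g_new + beta * d
        # Sufficient descent safeguard: restart when the test fails.
        if g_new @ d > -c * (g_new @ g_new):
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Quick check on the 2-D Rosenbrock function.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(cg_sufficient_descent(f, grad, np.array([-1.2, 1.0])))
```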
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- Comment on "A new three-term conjugate gradient method for unconstrained problem"
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Using function-values in multi-step quasi-Newton methods
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- New quasi-Newton methods for unconstrained optimization problems
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.