Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
DOI: 10.1051/RO/2016028 | zbMATH Open: 1358.49027 | OpenAlex: W2335839371 | MaRDI QID: Q2969958
Authors: Weijun Li, Yubo He, Xiaoliang Dong
Publication date: 24 March 2017
Published in: RAIRO - Operations Research
Full work available at URL: https://doi.org/10.1051/ro/2016028
Recommendations
- Modified Yabe-Takano nonlinear conjugate gradient method
- A modified conjugate gradient method with sufficient condition and conjugacy condition
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified conjugate-descent method and its global convergence
- A modified conjugacy condition and related nonlinear conjugate gradient method
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Two modified three-term conjugate gradient methods with sufficient descent property
- A modified three-term conjugate gradient method with sufficient descent property
Keywords: conjugacy condition; global convergence; sufficient descent condition; numerical comparison; Yabe–Takano conjugate gradient method
MSC classification: Numerical mathematical programming methods (65K05); Numerical methods based on nonlinear programming (49M37); Methods of quasi-Newton type (90C53)
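The keywords above describe nonlinear conjugate gradient methods that enforce a sufficient descent condition. As a rough illustration only, the sketch below runs a generic nonlinear CG iteration that restarts with steepest descent whenever the sufficient descent condition g_k^T d_k <= -c ||g_k||^2 is violated. The quadratic test problem, the exact line search, and the Hestenes-Stiefel beta are all assumptions made for a self-contained example; they are not the modified Yabe–Takano formulas of this paper.

```python
import numpy as np

# Illustrative convex quadratic f(x) = 0.5 x^T A x - b^T x (not from the paper)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])

def grad(x):
    return A @ x - b

def cg_sufficient_descent(x, c=1e-4, tol=1e-8, max_iter=100):
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Sufficient descent safeguard: require g^T d <= -c ||g||^2,
        # otherwise restart with the steepest descent direction.
        if g @ d > -c * (g @ g):
            d = -g
        # Exact line search for the quadratic model (illustrative shortcut;
        # practical methods use a Wolfe line search instead).
        alpha = -(g @ d) / (d @ A @ d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Hestenes-Stiefel beta, one classical choice; the paper's modified
        # Yabe-Takano parameters are built from a modified secant condition.
        beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On this quadratic the iterates converge to the minimizer A^{-1} b; the safeguard never needs to fire here, but on nonconvex problems it is what guarantees each search direction is a descent direction.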
Cites Work
- CUTE
- Benchmarking optimization software with performance profiles
- Function minimization by conjugate gradients
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- New quasi-Newton methods for unconstrained optimization problems
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A survey of nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Comment on ``A new three-term conjugate gradient method for unconstrained problem''
- A Modified BFGS Algorithm for Unconstrained Optimization
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Multi-step quasi-Newton methods for optimization
- Using function-values in multi-step quasi-Newton methods
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- Sufficient descent conjugate gradient methods for large-scale optimization problems
Cited In (3)
Uses Software