A conjugate gradient method with sufficient descent property
Publication: 747726
DOI: 10.1007/s11075-014-9946-5
zbMath: 1327.90317
OpenAlex: W2002303885
MaRDI QID: Q747726
Xiaoyan Qian, Feng Rao, Hao Liu, Hai-Jun Wang
Publication date: 19 October 2015
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-014-9946-5
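The record carries no abstract, but the class of method named in the title (a nonlinear conjugate gradient method whose search directions satisfy a sufficient descent condition) can be sketched generically. The sketch below is an illustrative assumption, not the authors' specific scheme: it combines a PRP+ conjugacy parameter with Armijo backtracking, restarting with steepest descent whenever the candidate direction fails the sufficient descent test.

```python
import math

def cg_sufficient_descent(f, grad, x0, c=0.1, tol=1e-6, max_iter=500):
    """Generic nonlinear CG with a sufficient-descent safeguard (sketch).

    Illustrative only: uses a PRP+ beta and Armijo backtracking; whenever
    the new direction fails the test g^T d <= -c * ||g||^2, it restarts
    with the steepest-descent direction. Not the cited paper's method.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]  # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = sum(gi * gi for gi in g)
        if math.sqrt(gnorm2) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        gd = sum(gi * di for gi, di in zip(g, d))
        t, fx = 1.0, f(x)
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * gd:
            t *= 0.5
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        # PRP+ parameter: beta = max(0, g_new^T (g_new - g) / ||g||^2)
        beta = max(0.0,
                   sum(gn * (gn - gi) for gn, gi in zip(g_new, g)) / gnorm2)
        d_new = [-gn + beta * di for gn, di in zip(g_new, d)]
        # Sufficient-descent safeguard: g_new^T d_new <= -c * ||g_new||^2
        gn2 = sum(gn * gn for gn in g_new)
        if sum(gn * dn for gn, dn in zip(g_new, d_new)) > -c * gn2:
            d_new = [-gn for gn in g_new]  # restart with steepest descent
        x, g, d = x_new, g_new, d_new
    return x

# Toy strictly convex quadratic with minimizer (3, -1), for demonstration
def f(x):
    return (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 3.0), 20.0 * (x[1] + 1.0)]
```

On this toy quadratic, `cg_sufficient_descent(f, grad, [0.0, 0.0])` converges to approximately `(3, -1)`; the safeguard guarantees every direction used by the line search is a descent direction.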
Related Items (1)
Uses Software
Cites Work
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Scaled conjugate gradient algorithms for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- A three-parameter family of nonlinear conjugate gradient methods
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Technical Note—A Modified Conjugate Gradient Algorithm
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- Conjugate Gradient Methods with Inexact Searches
- Numerical Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles