The convergence of conjugate gradient method with nonmonotone line search
From MaRDI portal
Publication: 606706
DOI: 10.1016/j.amc.2010.06.047
zbMath: 1206.65166
MaRDI QID: Q606706
Zhiwei Xu, Zhen-Jun Shi, Sheng-Quan Wang
Publication date: 18 November 2010
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2010.06.047
65K05: Numerical mathematical programming methods
Uses Software
Cites Work
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- Quadratic cost flow and the conjugate gradient method
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- Efficient generalized conjugate gradient algorithms. I: Theory
- Global convergence result for conjugate gradient methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Non-monotone trust-region algorithms for nonlinear optimization subject to convex constraints
- Analysis of monotone gradient methods
- Convergence of nonmonotone line search method
- Minimization of functions having Lipschitz continuous first partial derivatives
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. I: Theory
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. II: Application
- Convergence of descent method without line search
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- CUTE
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Global convergence of conjugate gradient methods without line search
- On the nonmonotone line search