The convergence of conjugate gradient method with nonmonotone line search
From MaRDI portal
Publication: 606706
DOI: 10.1016/j.amc.2010.06.047
zbMath: 1206.65166
OpenAlex: W2014919593
MaRDI QID: Q606706
Zhiwei Xu, Sheng-Quan Wang, Zhen-Jun Shi
Publication date: 18 November 2010
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2010.06.047
Related Items (10)
- A memory gradient method based on the nonmonotone technique
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- Globally convergent modified Perry's conjugate gradient method
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A new class of nonmonotone conjugate gradient training algorithms
- A hybrid conjugate finite-step length method for robust and efficient reliability analysis
- A nonmonotone supermemory gradient algorithm for unconstrained optimization
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- Nonconforming double set parameter finite element methods for a fourth order variational inequality with two-sided displacement obstacle
Uses Software
Cites Work
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- Quadratic cost flow and the conjugate gradient method
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- Efficient generalized conjugate gradient algorithms. I: Theory
- Global convergence result for conjugate gradient methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Non-monotone trust-region algorithms for nonlinear optimization subject to convex constraints
- Analysis of monotone gradient methods
- Convergence of nonmonotone line search method
- Minimization of functions having Lipschitz continuous first partial derivatives
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. I: Theory
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. II: Application
- Convergence of descent method without line search
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- CUTE
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Global convergence of conjugate gradient methods without line search
- On the nonmonotone line search