Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (Q849150)
From MaRDI portal
Language | Label | Description | Also known as
---|---|---|---
English | Sufficient descent nonlinear conjugate gradient methods with conjugacy condition | scientific article |
Statements
Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (English)
24 February 2010
The authors consider unconstrained optimization problems with a continuously differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\) and propose a class of modified conjugate gradient methods for solving them. The methods in this class share the property that the search direction \(d_k\) generated at iteration \(k\) and the corresponding gradient \(g_k\) of \(f\) satisfy the equality \(g_k^T d_k = -\|g_k\|^2\), so every \(d_k\) is a sufficient descent direction. Global convergence of the modified methods YT and YT+, which belong to the proposed class, is proved under suitable conditions. Extensive numerical experiments on test problems from the CUTE library demonstrate the efficiency of the proposed methods.
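To illustrate the kind of iteration the review describes, the following is a minimal sketch, in Python with NumPy, of a nonlinear conjugate gradient method whose directions satisfy \(g_k^T d_k = -\|g_k\|^2\) exactly. It is not the authors' YT or YT+ update; the projection of the previous direction and the PRP+ choice of \(\beta_k\) are assumptions made only to show one standard way of enforcing the sufficient descent identity.

```python
# Minimal sketch (assumption: not the authors' YT / YT+ formulas) of a
# nonlinear CG iteration whose directions satisfy g_k^T d_k = -||g_k||^2.
import numpy as np


def armijo_backtracking(f, x, d, g, alpha=1.0, rho=0.5, c1=1e-4):
    """Backtracking line search enforcing the Armijo condition."""
    fx = f(x)
    slope = np.dot(g, d)          # equals -||g||^2 < 0 by construction
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha


def sufficient_descent_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear CG with directions obeying g_k^T d_k = -||g_k||^2 exactly.

    The direction is
        d_k = -g_k + beta_k * (d_{k-1} - (d_{k-1}^T g_k / ||g_k||^2) g_k),
    i.e. the previous direction is projected orthogonally to g_k, so the
    identity g_k^T d_k = -||g_k||^2 holds for any beta_k.  Here beta_k is
    the PRP+ choice, an illustrative assumption rather than the paper's rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_backtracking(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        gg_new = np.dot(g_new, g_new)
        if gg_new == 0.0:                         # exact stationary point
            return x_new
        beta = max(np.dot(g_new, g_new - g) / np.dot(g, g), 0.0)   # PRP+
        proj = d - (np.dot(d, g_new) / gg_new) * g_new  # drop g-component
        d = -g_new + beta * proj                  # => g_new^T d = -||g_new||^2
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Usage example on the two-dimensional Rosenbrock function.
    f = lambda x: (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(sufficient_descent_cg(f, grad, [-1.2, 1.0]))
```

Since \(g_k^T d_k = -\|g_k\|^2 < 0\) whenever \(g_k \neq 0\), each \(d_k\) in the sketch is a descent direction and the backtracking loop always terminates.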
conjugate gradient method
line search
global convergence
unconstrained optimization