Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (Q849150)

From MaRDI portal
scientific article

    Statements

    Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (English)
    24 February 2010
    The authors consider unconstrained optimization problems with a continuously differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\). A class of modified conjugate gradient methods is proposed for solving these problems. All methods in this class share the property that the search direction \(d_k\) generated at iteration \(k\) and the corresponding gradient \(g_k\) of the function \(f\) satisfy the equality \(g_k^T d_k = -\|g_k\|^2\). Global convergence of the modified methods \(YT\) and \(YT+\), which belong to the proposed class, is proved under suitable conditions. Extensive numerical experiments on test problems from the CUTE library demonstrate the efficiency of the proposed methods.
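    The sufficient descent equality above can be enforced by a simple direction modification: adding a multiple of \(g_k\) to the usual conjugate gradient update cancels the \(\beta_k g_k^T d_{k-1}\) term, so \(g_k^T d_k = -\|g_k\|^2\) holds exactly regardless of the line search. The sketch below illustrates this construction with a Hestenes-Stiefel-type \(\beta_k\) and a backtracking Armijo line search; it is an illustrative assumption, not the paper's \(YT\) or \(YT+\) formulas.

```python
import numpy as np

def sufficient_descent_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG sketch whose directions satisfy g_k^T d_k = -||g_k||^2.

    Illustrative only: uses a Hestenes-Stiefel-type beta and a generic
    direction modification, not the YT/YT+ formulas of the reviewed paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search; g @ d = -||g||^2 < 0 guarantees descent.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gd = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        # Modified update: the extra g_new term cancels beta * (g_new @ d),
        # so g_new^T d_new = -||g_new||^2 holds exactly.
        gn2 = g_new @ g_new
        d = -g_new + beta * d - beta * ((g_new @ d) / gn2) * g_new
        x, g = x_new, g_new
    return x

# Example: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = sufficient_descent_cg(f, grad, np.zeros(2))
```

For the quadratic, the computed minimizer agrees with the exact solution of \(Ax = b\); the same loop applies unchanged to any smooth nonconvex objective, since the descent property is independent of \(\beta_k\).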
    conjugate gradient method
    line search
    global convergence
    unconstrained optimization

    Identifiers