On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (Q483732)

From MaRDI portal
scientific article
    Statements

    On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (English)
    17 December 2014
    Conjugate gradient (CG) methods comprise a class of unconstrained optimization algorithms characterized by low memory requirements and strong global convergence properties. Although CG methods are not the fastest or most robust optimization algorithms available today, they remain very popular with engineers and mathematicians engaged in solving large-scale problems of the form \(\min_{x \in \mathbb{R}^n} f(x)\), where \(f : \mathbb{R}^n \to \mathbb{R}\) is a smooth nonlinear function whose gradient is available. A method with search directions \(d_k\) satisfies the sufficient descent condition when \(g_k^\top d_k \le -c\,\|g_k\|^2\) for some constant \(c > 0\), where \(g_k\) denotes the gradient at the \(k\)-th iterate. Based on an eigenvalue study, the author establishes the sufficient descent condition for an extended class of the Hager-Zhang nonlinear conjugate gradient methods.
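    As a brief illustration of the objects involved (a sketch of our own, not code from the paper under review), the following applies the classical Hager-Zhang direction update to a convex quadratic with an exact line search and checks the sufficient descent bound \(g_k^\top d_k \le -\tfrac{7}{8}\|g_k\|^2\) that Hager and Zhang proved for their original method; the test problem and all variable names are assumptions for demonstration only.

```python
import numpy as np

# Assumed test problem: f(x) = 0.5 x^T A x - b^T x with SPD Hessian A,
# so grad f(x) = A x - b and the exact line-search step is available in closed form.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite
b = rng.standard_normal(n)

def grad(x):
    return A @ x - b

x = np.zeros(n)
g = grad(x)
d = -g                             # initial direction: steepest descent
for k in range(50):
    if np.linalg.norm(g) < 1e-10:  # converged
        break
    alpha = -(g @ d) / (d @ (A @ d))   # exact minimizer along d for the quadratic
    x_new = x + alpha * d
    g_new = grad(x_new)
    y = g_new - g
    dy = d @ y
    # Hager-Zhang beta: ((y - 2 d ||y||^2 / (d^T y))^T g_new) / (d^T y)
    beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
    d = -g_new + beta * d
    # Sufficient descent check: g^T d <= -(7/8) ||g||^2 (small relative slack
    # for floating-point rounding)
    assert g_new @ d <= -(0.875 - 1e-6) * (g_new @ g_new)
    x, g = x_new, g_new

print(f"final gradient norm: {np.linalg.norm(grad(x)):.2e}")
```

    On a quadratic with exact line search the update reduces to classical linear CG, so the iteration converges rapidly while the descent bound holds at every step; the point of the sketch is only to make the quantities \(g_k\), \(d_k\), and the Hager-Zhang \(\beta_k\) concrete.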
    unconstrained optimization
    conjugate gradient algorithm
    eigenvalue
    sufficient descent condition
    global convergence
    large-scale problem
    Hager-Zhang nonlinear conjugate gradient method
