On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (Q483732)

From MaRDI portal
DOI: 10.1007/s10288-014-0255-6

scientific article

    Statements

    On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (English)
    17 December 2014
    Conjugate gradient (CG) methods form a class of unconstrained optimization algorithms characterized by low memory requirements and strong global convergence properties. Although CG methods are not the fastest or most robust optimization algorithms available today, they remain popular among engineers and mathematicians solving large-scale problems of the form \(\min_{x \in \mathbb{R}^n} f(x)\), where \(f : \mathbb{R}^n \to \mathbb{R}\) is a smooth nonlinear function whose gradient is available. Based on an eigenvalue analysis, the author establishes the sufficient descent condition for an extended class of the Hager-Zhang nonlinear conjugate gradient methods.
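    The Hager-Zhang direction update can be sketched as follows. This is a minimal illustrative implementation, not the method as published: the function and the simple Armijo backtracking line search are assumptions standing in for the specialized Wolfe line search the original algorithm uses.

```python
import numpy as np

def hz_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of nonlinear CG with the Hager-Zhang beta.

    Illustrative only: uses Armijo backtracking instead of the
    Wolfe-type line search of the published method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0:                     # safeguard: restart if not a descent direction
            d = -g
            gd = g @ d
        # Armijo backtracking line search (assumed, for illustration)
        t, c, fx = 1.0, 1e-4, f(x)
        while f(x + t * d) > fx + c * t * gd:
            t *= 0.5
            if t < 1e-12:
                break
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g                   # gradient difference y_k
        dy = d @ y
        if abs(dy) < 1e-12:
            d = -g_new                  # restart on near-zero curvature
        else:
            # Hager-Zhang update:
            # beta = (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y)
            beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hz_cg(f, grad, np.zeros(2))
```

    The restart safeguard is one simple way to keep every search direction a descent direction; the point of the paper is that, for a suitable parameter range, the Hager-Zhang update itself guarantees sufficient descent without such interventions.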
    Keywords: unconstrained optimization; conjugate gradient algorithm; eigenvalue; sufficient descent condition; global convergence; large-scale problem; Hager-Zhang nonlinear conjugate gradient method
