A modified CG-DESCENT method for unconstrained optimization (Q535462)

Describes a project that uses: CUTEr


scientific article
Language: English
Label: A modified CG-DESCENT method for unconstrained optimization
Description: scientific article

    Statements

    A modified CG-DESCENT method for unconstrained optimization (English)
    11 May 2011
    This paper presents a modification of the CG-DESCENT method proposed by Hager and Zhang, yielding the sufficient descent estimate \(g_k^Td_k = -\|g_k\|^2\) instead of \(g_k^Td_k \leq -7\|g_k\|^2/8\) as in the original paper. The authors prove global convergence of their algorithm for strongly convex functions when the line search parameter satisfies the Wolfe conditions. Furthermore, global convergence is shown for general nonlinear functions when the line search parameter satisfies the strong Wolfe conditions. The section on numerical results compares the performance of the modified CG-DESCENT method with the original one and with two variants of the Polak-Ribière-Polyak approach on 73 test problems; a schematic sketch of the baseline CG-DESCENT direction update is given after the keyword list below.
    unconstrained optimization
    conjugate gradient method
    global convergence
    CG-DESCENT
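
    For orientation, the following is a minimal sketch of the baseline Hager-Zhang CG-DESCENT direction update \(d_{k+1} = -g_{k+1} + \beta_k d_k\) that the reviewed paper modifies. The review does not reproduce the authors' modified formula, so the code implements only the original Hager-Zhang parameter with its standard truncation; the parameter name eta and the helper cg_descent_direction are illustrative assumptions, not notation from the article.

    import numpy as np

    def cg_descent_direction(g_new, g_old, d_old, eta=0.01):
        """Baseline Hager-Zhang CG-DESCENT update (not the paper's modification).

        g_new, g_old : gradients at the current and previous iterates
        d_old        : previous search direction
        eta          : lower-bound parameter used in the truncation of beta
        """
        y = g_new - g_old                  # gradient difference y_k
        dy = d_old @ y                     # d_k^T y_k, assumed nonzero here
        # Hager-Zhang conjugacy parameter beta_k^N
        beta_n = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
        # Truncation beta_k = max(beta_k^N, eta_k) used for global convergence
        eta_k = -1.0 / (np.linalg.norm(d_old) * min(eta, np.linalg.norm(g_old)))
        beta = max(beta_n, eta_k)
        return -g_new + beta * d_old

    # Toy check with made-up gradient data: the original scheme guarantees
    # g^T d <= -(7/8)||g||^2, whereas the paper's modified method attains
    # g^T d = -||g||^2 exactly.
    g_old = np.array([1.0, -2.0])
    d_old = -g_old                         # first direction: steepest descent
    g_new = np.array([0.4, -0.9])
    d_new = cg_descent_direction(g_new, g_old, d_old)
    print(g_new @ d_new, -7.0 / 8.0 * (g_new @ g_new))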

    Identifiers