Two new conjugate gradient methods for unconstrained optimization (Q2179153)

From MaRDI portal
scientific article

    Statements

    Two new conjugate gradient methods for unconstrained optimization (English)
    Publication date: 12 May 2020
    Summary: The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, two new conjugate parameters are devised on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, both methods are shown to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
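    For context, the sketch below illustrates the nonlinear conjugate gradient framework the paper builds on: search directions d_k = -g_k + beta_k * d_{k-1} with step lengths from a strong Wolfe line search. Since the abstract does not reproduce the two new conjugate parameters, the sketch uses the classical CD formula beta_k = ||g_k||^2 / (-d_{k-1}^T g_{k-1}) instead; the function name cg_cd and the values of c1 and c2 are illustrative choices, and scipy.optimize.line_search is assumed to provide the strong Wolfe line search.

```python
import numpy as np
from scipy.optimize import line_search


def cg_cd(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with the classical CD (conjugate descent)
    parameter and a strong Wolfe line search. Illustrative sketch only; the
    paper's two new conjugate parameters are built on this beta but differ."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step length from a strong Wolfe line search (0 < c1 < c2 < 1).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                    # line search failed; tiny fallback step
            alpha = 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        # CD conjugate parameter: beta = ||g_{k+1}||^2 / (-d_k^T g_k).
        beta = np.dot(g_new, g_new) / (-np.dot(d, g))
        d = -g_new + beta * d                # new search direction
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    # Minimize the Rosenbrock function from a standard starting point.
    print(cg_cd(rosen, rosen_der, np.array([-1.2, 1.0])))  # approaches (1, 1)
```

    The fallback step is only a safeguard for a failed line search; the convergence theory summarized in the abstract assumes the strong Wolfe conditions hold at every iteration.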