A new family of conjugate gradient methods (Q2519734)

    Statements

    A new family of conjugate gradient methods (English)
    27 January 2009
    The conjugate gradient method is widely used for solving large-scale minimization problems because of its low storage requirements and simple computations. For some conjugate gradient algorithms, global convergence under certain line searches has not been proved, since these methods cannot guarantee descent of the objective function values at each iteration. This paper develops new line search approaches to overcome this drawback. The authors propose a new class of conjugate gradient methods for minimizing functions with Lipschitz continuous partial derivatives; this class contains the Polak-Ribière-Polyak and the Liu-Storey methods as special cases. A new nonmonotone line search is proposed to guarantee global convergence. By estimating the local Lipschitz constant of the derivative of the objective function, an adequate initial step size and a suitable step size can be chosen at each iteration, which reduces the number of function evaluations and improves the efficiency of the conjugate gradient methods. Numerical results are reported to demonstrate the effectiveness of the proposed method.
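    The review above describes the method only at a high level. As a point of reference, the following is a minimal, illustrative Python sketch of a Polak-Ribière-Polyak type nonlinear conjugate gradient iteration with a plain Armijo backtracking line search. It is not the authors' family of methods: the family parameter, the proposed nonmonotone line search, and the Lipschitz-constant-based choice of the initial step size are not reproduced here, and all names in the sketch are illustrative.

import numpy as np

def prp_conjugate_gradient(f, grad, x0, max_iter=500, tol=1e-6):
    # Illustrative PRP-type nonlinear conjugate gradient method.
    # Uses a standard Armijo backtracking line search, not the paper's
    # nonmonotone, Lipschitz-estimate-based line search.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0.0:
            d = -g  # restart: PRP directions need not be descent directions
        # Armijo backtracking line search (simplified stand-in)
        alpha, c1 = 1.0, 1e-4
        for _ in range(60):
            if f(x + alpha * d) <= f(x) + c1 * alpha * g.dot(d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak parameter, truncated at zero (PRP+)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on the Rosenbrock function as a small test problem.
rosenbrock = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
rosenbrock_grad = lambda x: np.array([
    -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
    200.0 * (x[1] - x[0]**2),
])
print(prp_conjugate_gradient(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))

    The PRP+ truncation and the restart test are common safeguards from the general literature; they stand in here for the descent and global convergence guarantees that the paper obtains through its nonmonotone line search and Lipschitz-constant estimation.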
    unconstrained optimization
    conjugate gradient method
    global convergence