Conjugate gradient algorithms in nonconvex optimization (Q947710)

From MaRDI portal
scientific article

    Statements

    Conjugate gradient algorithms in nonconvex optimization (English)
    6 October 2008
    The CG methods of shortest residuals developed by the author in his thesis are compared with several standard and non-standard conjugate gradient methods, including some quasi-Newton methods. It is shown that the Lemaréchal-Wolfe method for convex nonsmooth problems is equivalent to the Fletcher-Reeves CG method, provided that an exact line search is used. All facets of CG methods are treated extensively from both the theoretical and the numerical point of view; local and, in particular, global convergence properties and strategies are studied. The book covers standard methods for quadratic problems and for nonconvex smooth problems, memoryless quasi-Newton methods, preconditioned CG algorithms, limited-memory quasi-Newton algorithms, shortest-residual algorithms for smooth and for convex nonsmooth problems with and without preconditioning, and finally CG methods for box-constrained problems with and without preconditioning. The main results are derived in detail, so the reader can easily follow all the arguments used.

    It is a very nicely written book that can be used by researchers in optimization, in seminar teaching, and by students who know the foundations of optimization. Lists of figures, tables, and algorithms make the book a useful compendium for research and teaching. Numerous bibliographical hints, together with a large reference list, acquaint the reader with the historical development of CG methods and make it possible to study particular aspects in more detail from the original papers. The appendices on elements of topology, analysis, linear algebra, and numerical linear algebra make it a self-contained book.
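    To fix ideas, the Fletcher-Reeves method mentioned above can be sketched as follows. This is a generic minimal sketch of nonlinear Fletcher-Reeves CG, not code from the book; the function names and the backtracking (Armijo) line search are illustrative assumptions, whereas the equivalence result quoted in the review assumes an exact line search.

    ```python
    import numpy as np

    def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
        """Minimal Fletcher-Reeves nonlinear CG sketch (illustrative, not the book's code).

        Uses a simple backtracking (Armijo) line search rather than the
        exact line search assumed in the Lemarechal-Wolfe equivalence result.
        """
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g  # initial direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # backtracking line search along d
            alpha, fx = 1.0, f(x)
            while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
                alpha *= 0.5
                if alpha < 1e-16:
                    break
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
            d = -g_new + beta * d
            if g_new.dot(d) >= 0:  # safeguard: restart with steepest descent
                d = -g_new
            x, g = x_new, g_new
        return x

    # usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
    # whose minimizer solves A x = b
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    xstar = fletcher_reeves(lambda x: 0.5 * x @ A @ x - b @ x,
                            lambda x: A @ x - b,
                            np.zeros(2))
    ```

    On a quadratic with an exact line search this reduces to linear CG; with the inexact search above it still converges to the solution of A x = b in practice, which is why restarts and safeguards (a recurring theme in the book) matter for nonconvex problems.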
    conjugate gradient method
    local convergence
    global convergence
    smooth
    nonsmooth
    subgradient algorithm
    memoryless
    preconditioned
    restart
    limited memory
    shortest residual
    box constrained
    reduced Hessian