A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization

From MaRDI portal
Publication:747220

DOI: 10.1007/S40314-014-0172-5 · zbMATH Open: 1401.90177 · arXiv: 1212.5452 · OpenAlex: W3098305860 · MaRDI QID: Q747220


Authors: Ya-Guang Yang


Publication date: 23 October 2015

Published in: Computational and Applied Mathematics

Abstract: In this paper, an efficient modified Newton-type algorithm is proposed for nonlinear unconstrained optimization problems. The modified Hessian is a convex combination of the identity matrix (for the steepest descent algorithm) and the Hessian matrix (for the Newton algorithm). The coefficients of the convex combination are dynamically chosen in every iteration. The algorithm is proved to be globally and quadratically convergent for (convex and nonconvex) nonlinear functions. An efficient implementation is described. Numerical tests on widely used CUTE test problems are conducted for the new algorithm. The test results are compared with those obtained by the MATLAB Optimization Toolbox function fminunc. The test results are also compared with those obtained by some established and state-of-the-art algorithms, such as a limited-memory BFGS algorithm, a descent conjugate gradient algorithm, and a limited-memory descent conjugate gradient algorithm. The comparisons show that the new algorithm is promising.
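The abstract's central idea, a step matrix formed as a convex combination (1 - alpha) * I + alpha * H of the identity and the Hessian, can be illustrated with a minimal sketch. The coefficient rule below (halving alpha until the combination is positive definite) and the Armijo backtracking line search are assumptions for illustration, not the paper's actual dynamic coefficient choice or efficient implementation.

```python
import numpy as np

def modified_newton(f, grad, hess, x0, tol=1e-8, max_iter=200):
    """Sketch of a modified Newton method whose step matrix is a convex
    combination B = (1 - alpha) * I + alpha * H of the identity (steepest
    descent limit) and the Hessian (Newton limit). NOTE: the rule for
    choosing alpha here is a hypothetical stand-in, not the paper's."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        alpha = 1.0  # start from the pure Newton matrix
        while alpha > 1e-10:
            B = (1.0 - alpha) * np.eye(n) + alpha * H
            try:
                np.linalg.cholesky(B)  # succeeds iff B is positive definite
                break
            except np.linalg.LinAlgError:
                alpha *= 0.5  # lean further toward steepest descent
        else:
            B = np.eye(n)  # fall back to a pure gradient step
        d = np.linalg.solve(B, -g)  # descent direction
        # Backtracking (Armijo) line search on the step length
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
    return x
```

On a convex quadratic the Hessian is already positive definite, so alpha stays at 1 and the method reduces to a plain Newton step; the interpolation toward the identity only activates on nonconvex regions where the Hessian is indefinite.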


Full work available at URL: https://arxiv.org/abs/1212.5452






Cited In (6)






