A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
From MaRDI portal
Abstract: In this paper, an efficient modified Newton-type algorithm is proposed for nonlinear unconstrained optimization problems. The modified Hessian is a convex combination of the identity matrix (as in the steepest descent algorithm) and the Hessian matrix (as in Newton's algorithm). The coefficients of the convex combination are chosen dynamically at every iteration. The algorithm is proved to be globally and quadratically convergent for (convex and nonconvex) nonlinear functions. An efficient implementation is described. Numerical tests on the widely used CUTE test problems are conducted for the new algorithm. The results are compared with those obtained by the MATLAB optimization toolbox function fminunc, and also with those obtained by some established and state-of-the-art algorithms, such as a limited-memory BFGS method, a descent and conjugate gradient algorithm, and a limited-memory descent conjugate gradient algorithm. The comparisons show that the new algorithm is promising.
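The abstract describes the search direction as coming from a convex combination B_k = (1 - lam_k) H_k + lam_k I of the Hessian and the identity, with lam_k chosen dynamically. The sketch below illustrates that idea in Python under stated assumptions: the paper's actual dynamic rule for the coefficients is not reproduced here, so the stand-in heuristic (raise lam toward 1 until the blended matrix admits a Cholesky factorization) and all names (modified_newton_step, lam_growth) are illustrative, not the authors' method.

```python
# Minimal sketch, not the paper's algorithm: the model Hessian is
#     B = (1 - lam) * H + lam * I,
# blending Newton (lam = 0) with steepest descent (lam = 1).
# The coefficient update below is a stand-in heuristic only.
import numpy as np

def modified_newton_step(grad, hess, lam=0.0, lam_growth=0.5):
    """Return a descent direction d solving B d = -grad, where
    B = (1 - lam) * hess + lam * I is made positive definite by
    moving lam toward 1 whenever Cholesky factorization fails."""
    n = grad.size
    while True:
        B = (1.0 - lam) * hess + lam * np.eye(n)
        try:
            L = np.linalg.cholesky(B)          # fails unless B is SPD
        except np.linalg.LinAlgError:
            lam = lam + lam_growth * (1.0 - lam)  # push lam toward 1
            continue
        # Solve B d = -grad using the Cholesky factors L L^T = B.
        y = np.linalg.solve(L, -grad)
        return np.linalg.solve(L.T, y), lam

def minimize(f, grad_f, hess_f, x0, tol=1e-8, max_iter=100):
    """Modified-Newton iteration with a backtracking (Armijo) line
    search; nonconvex f is handled because lam is raised whenever
    the blended Hessian is indefinite."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d, _ = modified_newton_step(g, hess_f(x))
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5                           # Armijo backtracking
        x = x + t * d
    return x

if __name__ == "__main__":
    # Example: the 2D Rosenbrock function, a standard nonconvex test.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                            200 * (x[1] - x[0]**2)])
    H = lambda x: np.array([[2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
                            [-400 * x[0], 200.0]])
    print(minimize(f, g, H, np.array([-1.2, 1.0])))  # converges near [1, 1]
```

Because B is kept symmetric positive definite, d is always a descent direction, so the Armijo backtracking loop terminates; near a solution with a positive definite Hessian, lam = 0 recovers the pure Newton step, which is consistent with the quadratic convergence claimed in the abstract.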
Recommendations
- scientific article; zbMATH DE number 4110473
- Global convergence of a modified BFGS-type method for unconstrained non-convex minimization
- An algorithm for unconstrained optimization
- Globally convergent algorithms for unconstrained optimization
- A nonmonotone modified BFGS algorithm for nonconvex unconstrained optimization problems
Cites work
- scientific article; zbMATH DE number 4213315
- scientific article; zbMATH DE number 3526471
- scientific article; zbMATH DE number 1206370
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A method for the solution of certain non-linear problems in least squares
- A survey of conjugate gradient algorithms for solution of extreme eigen-problems of a symmetric matrix
- CUTE
- Computing Modified Newton Directions Using a Partial Cholesky Factorization
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Descentwise inexact proximal algorithms for smooth optimization
- Function minimization by conjugate gradients
- Globally convergent optimization algorithms on Riemannian manifolds: Uniform framework for unconstrained and constrained optimization
- In favor of conjugate directions: a generalized acceptable-point algorithm for function minimization
- Line search algorithms with guaranteed sufficient decrease
- Newton-type methods for unconstrained and linearly constrained optimization
- On conjugate gradient-like methods for eigen-like problems
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Self-adaptive inexact proximal point methods
- TACO: a toolkit for AMPL control optimization
- The Geometry of Algorithms with Orthogonality Constraints
- The Limited Memory Conjugate Gradient Method
- The Sequential Parameter Optimization Toolbox
- Updating Quasi-Newton Matrices with Limited Storage
Cited in (6)
- The spherical quadratic steepest descent (SQSD) method for unconstrained minimization with no explicit line searches
- A robust BFGS algorithm for unconstrained nonlinear optimization problems
- An Algorithm for Unconstrained Quadratically Penalized Convex Optimization
- A polynomial time infeasible interior-point arc-search algorithm for convex optimization
- scientific article; zbMATH DE number 4110473
- UOBYQA: unconstrained optimization by quadratic approximation