A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization (Q2297654)
From MaRDI portal
Property / DOI: 10.1007/s10107-019-01362-7
Property / Wikidata QID: Q128532682
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization | scientific article | |
Statements
A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization (English)
Publication date: 20 February 2020
The authors deal with unconstrained optimization problems whose objective function is in general nonconvex, twice continuously differentiable, and has a Lipschitz continuous Hessian. The aim of the paper is to develop a method that follows the classical Newton-CG procedure but adds safeguards and enhancements that allow worst-case complexity results to be proved for convergence to points satisfying approximate first- and second-order necessary optimality conditions. The algorithm has two major components. In the first, a linear conjugate gradient (CG) method is applied to a slightly damped Newton system; this process terminates either with an approximation to the Newton step or with a direction of negative curvature. The second is a ``minimum eigenvalue oracle'', which returns either a direction of sufficient negative curvature for a given symmetric matrix or a certificate that the matrix is almost positive definite. A backtracking line search along the chosen direction then identifies a new iterate satisfying a sufficient decrease condition. It is shown that a significant decrease in the objective function is attained at reasonable computational cost, measured in the number of gradient evaluations and Hessian-vector products. A global worst-case complexity analysis of the algorithm is presented: bounds on the number of Hessian-vector products and gradient evaluations needed to reach points satisfying the approximate first- and second-order optimality conditions are proved. Elements of this analysis follow those of the earlier paper [\textit{C. W. Royer} and \textit{S. J. Wright}, SIAM J. Optim. 28, No. 2, 1448--1477 (2018; Zbl 1391.49055)].
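For concreteness, an approximate first-order point in this setting is one with \(\|\nabla f(x)\| \le \epsilon_g\), and an approximate second-order point additionally satisfies \(\lambda_{\min}(\nabla^2 f(x)) \ge -\epsilon_H\). The sketch below is a minimal Python illustration of the two components just described: CG applied to a damped Newton system with an early exit when negative curvature is detected, followed by a backtracking line search enforcing a cubic-style sufficient decrease. It is a sketch under simplifying assumptions, not the authors' Algorithm 1: the callables `f`, `grad`, `hess_vec`, the residual tolerance, and all constants are hypothetical choices, and the minimum eigenvalue oracle is folded into the negative-curvature test inside CG for brevity.

```python
# Minimal sketch of a safeguarded Newton-CG iteration (illustrative only).
# Assumes callables f(x), grad(x), hess_vec(x, v); tolerances and
# constants are hypothetical choices, not the paper's parameters.
import numpy as np


def capped_cg(hv, g, damping, eps, max_iters):
    """CG on (H + damping*I) d = -g with a negative-curvature exit.

    Returns ('newton', d) with an approximate damped Newton step, or
    ('neg_curv', p) with a unit direction whose damped curvature is
    below eps (a certificate that the matrix is nearly indefinite).
    """
    d = np.zeros_like(g)
    r = g.copy()                      # residual of the system at d = 0
    p = -r                            # first CG search direction
    for _ in range(max_iters):
        Ap = hv(p) + damping * p
        curv = p @ Ap
        if curv < eps * (p @ p):      # insufficient positive curvature
            return 'neg_curv', p / np.linalg.norm(p)
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_next = r + alpha * Ap
        if np.linalg.norm(r_next) <= 0.1 * np.linalg.norm(g):  # hypothetical tolerance
            return 'newton', d
        p = -r_next + ((r_next @ r_next) / (r @ r)) * p
        r = r_next
    return 'newton', d


def line_search(f, x, d, eta=0.1, theta=0.5, max_back=60):
    """Backtrack until f(x + a*d) <= f(x) - (eta/6) * a^3 * ||d||^3."""
    fx, nd3, a = f(x), np.linalg.norm(d) ** 3, 1.0
    for _ in range(max_back):
        if f(x + a * d) <= fx - (eta / 6.0) * a ** 3 * nd3:
            break
        a *= theta
    return x + a * d


def newton_cg_sketch(f, grad, hess_vec, x0, eps=1e-5, max_outer=200):
    """Iterate safeguarded Newton-CG steps until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) <= eps:  # approximate first-order point
            break
        hv = lambda v, x=x: hess_vec(x, v)
        kind, d = capped_cg(hv, g, damping=2 * eps, eps=eps,
                            max_iters=10 * x.size)
        if kind == 'neg_curv' and g @ d > 0:
            d = -d                    # make the curvature direction a descent one
        x = line_search(f, x, d)
    return x


# Tiny usage example: a nonconvex function with a saddle near the start.
if __name__ == '__main__':
    f = lambda x: (x[0] ** 2 - 1) ** 2 + 0.5 * x[1] ** 2
    grad = lambda x: np.array([4 * x[0] * (x[0] ** 2 - 1), x[1]])
    hess_vec = lambda x, v: np.array([(12 * x[0] ** 2 - 4) * v[0], v[1]])
    print(newton_cg_sketch(f, grad, hess_vec, [1e-3, 1.0]))  # approx (+/-1, 0)
```

As the review notes, the per-iteration cost of such a scheme is dominated by the Hessian-vector products inside CG, which is exactly the quantity the paper's operation complexity bounds count.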
unconstrained optimization
smooth nonconvex optimization
Newton's method
conjugate gradient method
first-order optimality conditions
second-order optimality conditions
worst-case complexity