A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization (Q2297654)

From MaRDI portal
Property / DOI: 10.1007/s10107-019-01362-7
Property / MaRDI profile type: MaRDI publication profile
Property / OpenAlex ID: W2963900797
Property / arXiv ID: 1803.02924
Property / Wikidata QID: Q128532682

Property / cites work: Finding approximate local minima faster than gradient descent
Property / cites work: The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
Property / cites work: Q2809807
Property / cites work: Accelerated Methods for NonConvex Optimization
Property / cites work: On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
Property / cites work: Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
Property / cites work: Complexity bounds for second-order optimality in unconstrained optimization
Property / cites work: Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
Property / cites work: Trust Region Methods
Property / cites work: A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
Property / cites work: An inexact regularized Newton framework with a worst-case iteration complexity of \(\mathcal{O}(\varepsilon^{-3/2})\) for nonconvex optimization
Property / cites work: Truncated-Newton algorithms for large-scale unconstrained optimization
Property / cites work: A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
Property / cites work: Exploiting negative curvature directions in linesearch methods for unconstrained optimization
Property / cites work: Evaluating Derivatives
Property / cites work: Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
Property / cites work: Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
Property / cites work: Cubic regularization of Newton method and its global performance
Property / cites work: Q5491447
Property / cites work: Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
Property / cites work: The Conjugate Gradient Method and Trust Regions in Large Scale Optimization
 


scientific article

Language: English
Label: A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
Description: scientific article

    Statements

    A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization (English)
    Publication date: 20 February 2020
    The authors deal with unconstrained optimization problems whose objective function is in general nonconvex and twice continuously differentiable with a Lipschitz continuous Hessian. The aim of the paper is to develop a method that follows classical Newton-CG procedures but includes safeguards and enhancements that allow worst-case complexity results to be proved for convergence to points satisfying approximate first- and second-order necessary optimality conditions. The algorithm has two major components. In the first, a linear conjugate gradient (CG) method is used to solve a slightly damped Newton system; this process terminates either with an approximation to the Newton step or with a direction of negative curvature. The second is a ``minimum eigenvalue oracle'', which returns either a direction of sufficient negative curvature for a given symmetric matrix or a certificate that the matrix is almost positive definite. A backtracking line search is then used to identify a new iterate satisfying a sufficient decrease condition. It is shown that a significant decrease in the objective function is attained at reasonable computational cost, measured in the number of gradient evaluations or Hessian-vector products. A global worst-case complexity analysis of the algorithm is presented: bounds on the number of Hessian-vector products and gradient evaluations required to reach points satisfying the first- and second-order optimality conditions are proved. Elements of this analysis follow those of the earlier paper [\textit{C. W. Royer} and \textit{S. J. Wright}, SIAM J. Optim. 28, No. 2, 1448--1477 (2018; Zbl 1391.49055)].
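    To make the structure of an outer iteration concrete, the following minimal Python sketch illustrates the two steps described above: a capped CG solve of the damped Newton system that returns either an approximate Newton step or a negative curvature direction, followed by a backtracking line search enforcing a cubic-type sufficient decrease. This is an illustration of the idea only, not the authors' implementation: the constants zeta, theta, and eta, the simplified curvature test, the unit rescaling of negative curvature directions, and the omission of the paper's implicit condition-number safeguard and of the minimum eigenvalue oracle are simplifying assumptions.

        import numpy as np

        def capped_cg(hess_vec, g, eps_H, zeta=0.25, max_iters=500):
            # Apply CG to the damped system (H + 2*eps_H*I) d = -g using only
            # Hessian-vector products hess_vec.  Return ('SOL', d) with an
            # approximate Newton step, or ('NC', p) with a direction along which
            # H has curvature below -eps_H.  (The paper's capped CG additionally
            # tracks an implicit condition-number bound, omitted here.)
            damped = lambda v: hess_vec(v) + 2.0 * eps_H * v
            d = np.zeros_like(g)
            r = g.copy()                    # residual of the damped system at d = 0
            p = -r                          # initial CG direction
            g_norm = np.linalg.norm(g)
            for _ in range(max_iters):
                dp = damped(p)
                curv = p @ dp
                if curv < eps_H * (p @ p):  # curvature test fails:
                    return 'NC', p          # negative curvature direction found
                rr_old = r @ r
                alpha = rr_old / curv       # standard CG updates
                d = d + alpha * p
                r = r + alpha * dp
                rr_new = r @ r
                if np.sqrt(rr_new) <= zeta * g_norm:
                    return 'SOL', d         # residual small: accept inexact Newton step
                p = -r + (rr_new / rr_old) * p
            return 'SOL', d

        def newton_cg_step(f, grad, hess_vec, x, eps_H, theta=0.5, eta=0.01,
                           max_backtracks=60):
            # One outer iteration: choose a direction via capped CG, then
            # backtrack until a cubic-type sufficient decrease condition holds.
            g = grad(x)
            status, d = capped_cg(hess_vec, g, eps_H)
            if status == 'NC':
                if d @ g > 0:               # orient as a descent direction
                    d = -d
                d = d / np.linalg.norm(d)   # the paper rescales by the curvature
                                            # estimate; unit scaling is a simplification
            fx, nd3 = f(x), np.linalg.norm(d) ** 3
            t = 1.0
            for _ in range(max_backtracks):
                if f(x + t * d) <= fx - eta * t**3 * nd3:
                    break                   # sufficient decrease achieved
                t *= theta                  # otherwise backtrack
            return x + t * d

    A complete method would iterate newton_cg_step until the gradient norm falls below a tolerance, then invoke the minimum eigenvalue oracle (implementable by randomized Lanczos, cf. the cited work of Kuczyński and Woźniakowski on estimating the largest eigenvalue with a random start) either to certify approximate second-order optimality or to supply an escape direction.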
    Keywords: unconstrained optimization; smooth nonconvex optimization; Newton's method; conjugate gradient method; first-order optimality conditions; second-order optimality conditions; worst-case complexity