A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
Keywords: unconstrained optimization; second-order optimality conditions; conjugate gradient method; Newton's method; worst-case complexity; first-order optimality conditions; smooth nonconvex optimization
MSC classifications: Numerical optimization and variational techniques (65K10); Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Nonconvex programming, global optimization (90C26); Methods of quasi-Newton type (90C53); Iterative numerical methods for linear systems (65F10); Abstract computational complexity for mathematical programming problems (90C60)
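The keywords above describe a line-search Newton method whose Newton system is solved by the conjugate gradient method, with safeguards related to negative curvature and second-order optimality. As a rough, generic illustration only (the function names, tolerances, and the plain truncated-CG loop below are illustrative assumptions, not the paper's capped-CG procedure or its complexity-guaranteeing safeguards), a truncated Newton-CG iteration can be sketched as:

```python
import numpy as np

def newton_cg(f, grad, hess, x0, tol=1e-6, max_iter=100, cg_tol=0.5):
    """Generic truncated Newton-CG sketch: solve H d = -g inexactly by CG,
    stopping early if negative curvature is detected (a standard safeguard;
    not the specific capped-CG algorithm of the paper above)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess(x)
        # --- truncated CG on the Newton system H d = -g ---
        d = np.zeros_like(x)
        r = g.copy()          # residual of H d + g at d = 0
        p = -r
        for _ in range(2 * x.size):
            Hp = H @ p
            curv = p @ Hp
            if curv <= 1e-12 * (p @ p):
                # Nonpositive curvature detected: keep the current CG
                # iterate, or fall back to steepest descent if CG has
                # not produced a step yet.
                if np.allclose(d, 0.0):
                    d = -g
                break
            alpha = (r @ r) / curv
            d = d + alpha * p
            r_new = r + alpha * Hp
            if np.linalg.norm(r_new) <= cg_tol * gnorm:
                break  # inexact-Newton residual test satisfied
            beta = (r_new @ r_new) / (r @ r)
            p = -r_new + beta * p
            r = r_new
        # --- Armijo backtracking line search along the descent direction d ---
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```

The negative-curvature branch is what distinguishes Newton-CG variants for nonconvex problems from their convex counterparts: methods of the kind surveyed below exploit such directions (or cap the CG iteration) to obtain worst-case complexity guarantees for approximate second-order stationarity.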
- scientific article; zbMATH DE number 5060482
- A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Accelerated methods for nonconvex optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- An inexact regularized Newton framework with a worst-case iteration complexity of \(\mathscr{O}(\varepsilon^{-3/2})\) for nonconvex optimization
- Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
- Complexity bounds for second-order optimality in unconstrained optimization
- Convex optimization: algorithms and complexity
- Cubic regularization of Newton method and its global performance
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Evaluating Derivatives
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization
- Finding approximate local minima faster than gradient descent
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- The Conjugate Gradient Method and Trust Regions in Large Scale Optimization
- The use of quadratic regularization with a cubic descent condition for unconstrained optimization
- Truncated-Newton algorithms for large-scale unconstrained optimization
- Trust Region Methods
- Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
- Trust-region Newton-CG with strong second-order complexity guarantees for nonconvex optimization
- Convergence of Newton-MR under inexact Hessian information
- First-Order Methods for Nonconvex Quadratic Minimization
- Inexact derivative-free optimization for bilevel learning
- A Newton-CG Based Barrier Method for Finding a Second-Order Stationary Point of Nonconvex Conic Optimization with Complexity Guarantees
- Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
- A Newton-CG Based Augmented Lagrangian Method for Finding a Second-Order Stationary Point of Nonconvex Equality Constrained Optimization with Complexity Guarantees
- Complexity of a projected Newton-CG method for optimization with bounds
- A truncated three-term conjugate gradient method with complexity guarantees with applications to nonconvex regression problem
- A log-barrier Newton-CG method for bound constrained optimization with complexity guarantees
- Linesearch Newton-CG methods for convex optimization with noise
- Nonlinear conjugate gradient for smooth convex functions
- A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization
- An efficient hybrid conjugate gradient method with an adaptive strategy and applications in image restoration problems
- Parameter-free accelerated gradient descent for nonconvex minimization
- A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
- Inexact Newton-CG algorithms with complexity guarantees
- Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
- A hybrid inexact regularized Newton and negative curvature method
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- MINRES: from negative curvature detection to monotonicity properties
- Computing second-order points under equality constraints: revisiting Fletcher's augmented Lagrangian
- scientific article; zbMATH DE number 7366720
- Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
- Worst-Case Complexity of TRACE with Inexact Subproblem Solutions for Nonconvex Smooth Optimization
- Complexity analysis of interior-point methods for second-order stationary points of nonlinear semidefinite optimization problems
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
MaRDI item: Q2297654