Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
DOI: 10.1137/17M1144854
zbMath: 1437.90128
arXiv: 1811.01220
OpenAlex: W2962948705
MaRDI QID: Q5217594
Coralia Cartis, Nicholas I. M. Gould, Philippe L. Toint
Publication date: 25 February 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1811.01220
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Numerical mathematical programming methods (65K05)
- Abstract computational complexity for mathematical programming problems (90C60)
- Nonconvex programming, global optimization (90C26)
- Numerical methods based on nonlinear programming (49M37)
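A brief note on the indexed result (paraphrased from the paper; the symbols \(p\), \(q\), and \(\epsilon\) follow the paper's notation): using a \(p\)-th order adaptive regularization method built on degree-\(p\) Taylor models, an \(\epsilon\)-approximate \(q\)-th-order critical point (\(1 \le q \le p\)) of a smooth nonconvex objective subject to inexpensive constraints is reached in at most
\[
\mathcal{O}\!\left(\epsilon^{-(p+1)/(p-q+1)}\right)
\]
evaluations of the objective and its derivatives, and examples are given showing that this order is sharp.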
Related Items
- A nonlinear conjugate gradient method using inexact first-order information
- An adaptive regularization method in Banach spaces
- Convergence Properties of an Objective-Function-Free Optimization Regularization Algorithm, Including an \(\boldsymbol{\mathcal{O}(\epsilon^{-3/2})}\) Complexity Bound
- Super-Universal Regularized Newton Method
- Trust-region algorithms: probabilistic complexity and intrinsic noise with applications to subsampling techniques
- Approximating Higher-Order Derivative Tensors Using Secant Updates
- The evaluation complexity of finding high-order minimizers of nonconvex optimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- The impact of noise on evaluation complexity: the deterministic trust-region case
- An algorithm for the minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
- Derivative-free optimization methods
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Introductory lectures on convex optimization. A basic course.
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- Cubic regularization of Newton method and its global performance
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Trust Region Methods
- Accelerated Methods for Nonconvex Optimization
- Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Black-Box Complexity of Local Minimization
- The Best Rank-1 Approximation of a Symmetric Tensor and Related Spherical Optimization Problems
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Affine conjugate adaptive Newton methods for nonlinear elastomechanics
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians