On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization

From MaRDI portal
Publication:2902870

DOI: 10.1137/100812276 · zbMath: 1250.90083 · OpenAlex: W2043674215 · Wikidata: Q58185697 · Scholia: Q58185697 · MaRDI QID: Q2902870

Nicholas I. M. Gould, Coralia Cartis, Philippe L. Toint

Publication date: 22 August 2012

Published in: SIAM Journal on Optimization

Full work available at URL: http://purl.org/net/epubs/work/53954



Related Items

- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
- A cubic regularization of Newton's method with finite difference Hessian approximations
- A note about the complexity of minimizing Nesterov's smooth Chebyshev–Rosenbrock function
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Cubic overestimation and secant updating for unconstrained optimization of C2,1 functions
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- An adaptive regularization method in Banach spaces
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- Quadratic regularization methods with finite-difference gradient approximations
- Inexact accelerated high-order proximal-point methods
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Complexity bounds for second-order optimality in unconstrained optimization
- On the complexity of finding first-order critical points in constrained nonlinear optimization
- Projected adaptive cubic regularization algorithm with derivative-free filter technique for box constrained optimization
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- Derivative-free restrictively preconditioned conjugate gradient path method without line search technique for solving linear equality constrained optimization
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- Worst case complexity of direct search under convexity
- Evaluation Complexity for Nonlinear Constrained Optimization Using Unscaled KKT Conditions and High-Order Models
- A second-order globally convergent direct-search method and its worst-case complexity
- Worst case complexity of direct search
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the Nonsmooth Case
- Derivative-free optimization methods
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- An interior affine scaling cubic regularization algorithm for derivative-free optimization subject to bound constraints


Uses Software