Worst case complexity of direct search under convexity
From MaRDI portal
Publication: 5962720
DOI: 10.1007/s10107-014-0847-0 · zbMath: 1338.90462 · OpenAlex: W1996491707 · Wikidata: Q58040489 · Scholia: Q58040489 · MaRDI QID: Q5962720
M. Dodangeh, Luis Nunes Vicente
Publication date: 23 February 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-014-0847-0
Keywords: convexity, derivative-free optimization, direct search, global rate, sufficient decrease, worst case complexity
Mathematics Subject Classification: Convex programming (90C25) · Nonlinear programming (90C30) · Derivative-free methods and methods using generalized derivatives (90C56)
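The keywords above refer to directional direct-search methods that accept a trial point only under a sufficient-decrease condition. The following is a minimal illustrative sketch of that general technique, not the algorithm analyzed in the publication; the function names, the forcing function rho(alpha) = c·alpha², and all default parameters are assumptions chosen for the example.

```python
def direct_search(f, x0, alpha0=1.0, c=1e-4, tol=1e-8, max_iter=10000):
    """Directional direct search with sufficient decrease (illustrative sketch).

    Polls the positive and negative coordinate directions (a positive
    spanning set) and accepts y = x + alpha*d only if
    f(y) < f(x) - c*alpha**2, i.e. decrease by the forcing function
    rho(alpha) = c*alpha^2.  Parameter choices here are assumptions,
    not taken from the paper.
    """
    x = list(x0)
    alpha = alpha0
    fx = f(x)
    n = len(x)
    # Poll set D = {+e_1, ..., +e_n, -e_1, ..., -e_n}
    dirs = [[1 if j == i else 0 for j in range(n)] for i in range(n)]
    dirs += [[-d for d in v] for v in dirs]
    for _ in range(max_iter):
        if alpha < tol:  # step size below tolerance: stop
            break
        success = False
        for d in dirs:
            y = [xi + alpha * di for xi, di in zip(x, d)]
            fy = f(y)
            if fy < fx - c * alpha**2:  # sufficient-decrease test
                x, fx = y, fy
                success = True
                break
        # Expand the step size after a successful poll, contract otherwise
        alpha = 2.0 * alpha if success else 0.5 * alpha
    return x, fx

# Example: minimize a smooth convex quadratic f(x) = (x1-1)^2 + 2*x2^2
xstar, fstar = direct_search(lambda v: (v[0] - 1) ** 2 + 2 * v[1] ** 2,
                             [5.0, 3.0])
```

For smooth convex objectives like the quadratic above, the paper's subject is precisely the worst-case number of such function evaluations needed to drive the optimality gap below a given tolerance.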
Related Items (12)
- On the optimal order of worst case complexity of direct search
- Efficient unconstrained black box optimization
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
- An indicator for the switch from derivative-free to derivative-based optimization
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization
- Stochastic Three Points Method for Unconstrained Smooth Minimization
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case
- Derivative-free optimization methods
- Direct Search Based on Probabilistic Descent
Uses Software
Cites Work
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Worst case complexity of direct search
- Introductory lectures on convex optimization. A basic course.
- Random gradient-free minimization of convex functions
- Cubic regularization of Newton method and its global performance
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Introduction to Derivative-Free Optimization
- On the Local Convergence of Pattern Search
- CUTEr and SifDec
- Convex Analysis