Worst case complexity of direct search under convexity
DOI: 10.1007/s10107-014-0847-0
zbMATH Open: 1338.90462
OpenAlex: W1996491707
Wikidata: Q58040489 (Scholia: Q58040489)
MaRDI QID: Q5962720
FDO: Q5962720
Publication date: 23 February 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-014-0847-0
Recommendations
- On the optimal order of worst case complexity of direct search
- Worst case complexity of direct search
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- STACS 2005
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Complexity of multilinear problems in the worst case setting
- Scientific article (zbMATH DE number 509206)
- Scientific article (zbMATH DE number 1512702)
Keywords: convexity; direct search; worst case complexity; derivative-free optimization; global rate; sufficient decrease
MSC classifications: Convex programming (90C25); Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- CUTEr and SifDec
- Title not available
- Introductory lectures on convex optimization. A basic course.
- Convex Analysis
- Random gradient-free minimization of convex functions
- Efficiency of coordinate descent methods on huge-scale optimization problems
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Introduction to Derivative-Free Optimization
- Title not available
- On the Local Convergence of Pattern Search
- Worst case complexity of direct search
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic regularization of Newton method and its global performance
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
Cited In (15)
- An indicator for the switch from derivative-free to derivative-based optimization
- Title not available
- On the optimal order of worst case complexity of direct search
- Direct Search Based on Probabilistic Descent
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- On the worst-case inefficiency of CGKA
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
- Trust-region methods without using derivatives: worst case complexity and the nonsmooth case
- Stochastic zeroth order descent with structured directions
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- Efficient unconstrained black box optimization
- Derivative-free optimization methods
- Stochastic Three Points Method for Unconstrained Smooth Minimization
Uses Software