Worst case complexity of direct search
From MaRDI portal
Publication: 743632
DOI: 10.1007/S13675-012-0003-7
zbMath: 1304.90198
OpenAlex: W2041322191
MaRDI QID: Q743632
Publication date: 30 September 2014
Published in: EURO Journal on Computational Optimization
Full work available at URL: https://doi.org/10.1007/s13675-012-0003-7
MSC classification: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- Analysis of direct searches for discontinuous functions
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- On Nesterov's nonsmooth Chebyshev-Rosenbrock functions
- Introductory lectures on convex optimization. A basic course.
- Cubic regularization of Newton method and its global performance
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- On the Convergence of Pattern Search Algorithms
- A recursive ℓ∞-trust-region method for bound-constrained nonlinear optimization
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Introduction to Derivative-Free Optimization
- Analysis of Generalized Pattern Searches
- On the Local Convergence of Pattern Search
- Pattern Search Methods for Linearly Constrained Minimization
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
Related Items (35)
- On the optimal order of worst case complexity of direct search
- A note about the complexity of minimizing Nesterov's smooth Chebyshev–Rosenbrock function
- Efficient unconstrained black box optimization
- A derivative-free comirror algorithm for convex optimization
- Improving Direct Search algorithms by multilevel optimization techniques
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Quadratic regularization methods with finite-difference gradient approximations
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
- Complexity bounds for second-order optimality in unconstrained optimization
- On the complexity of finding first-order critical points in constrained nonlinear optimization
- An indicator for the switch from derivative-free to derivative-based optimization
- Worst case complexity bounds for linesearch-type derivative-free algorithms
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Complexity of gradient descent for multiobjective optimization
- Worst case complexity of direct search under convexity
- Evaluation Complexity for Nonlinear Constrained Optimization Using Unscaled KKT Conditions and High-Order Models
- Stochastic trust-region and direct-search methods: a weak tail bound condition and reduced sample sizing
- Retraction-based direct search methods for derivative free Riemannian optimization
- A second-order globally convergent direct-search method and its worst-case complexity
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization
- Stochastic Three Points Method for Unconstrained Smooth Minimization
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the Nonsmooth Case
- Derivative-free optimization methods
- Expected complexity analysis of stochastic direct-search
- Levenberg-Marquardt method based on probabilistic Jacobian models for nonlinear equations
- Direct Search Based on Probabilistic Descent
- Derivative-free methods for mixed-integer constrained optimization problems