Worst case complexity of direct search
Publication: Q743632
DOI: 10.1007/s13675-012-0003-7
zbMATH Open: 1304.90198
OpenAlex: W2041322191
MaRDI QID: Q743632
Author: L. N. Vicente
Publication date: 30 September 2014
Published in: EURO Journal on Computational Optimization
Full work available at URL: https://doi.org/10.1007/s13675-012-0003-7
Recommendations
- Worst case complexity of direct search under convexity
- A generalized worst-case complexity analysis for non-monotone line searches
- On the optimal order of worst case complexity of direct search
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- A second-order globally convergent direct-search method and its worst-case complexity
MSC classification: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- Introductory lectures on convex optimization. A basic course.
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- On the Convergence of Pattern Search Algorithms
- Introduction to Derivative-Free Optimization
- Title not available
- On the Local Convergence of Pattern Search
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- Pattern Search Methods for Linearly Constrained Minimization
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- On Nesterov's nonsmooth Chebyshev-Rosenbrock functions
- Analysis of direct searches for discontinuous functions
- Analysis of Generalized Pattern Searches
- Cubic regularization of Newton method and its global performance
- A recursive ℓ∞-trust-region method for bound-constrained nonlinear optimization
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
Cited In (35)
- Complexity of gradient descent for multiobjective optimization
- An indicator for the switch from derivative-free to derivative-based optimization
- Stochastic trust-region and direct-search methods: a weak tail bound condition and reduced sample sizing
- A derivative-free comirror algorithm for convex optimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Derivative-free methods for mixed-integer constrained optimization problems
- On the optimal order of worst case complexity of direct search
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- Worst case complexity of direct search under convexity
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- On the complexity of finding first-order critical points in constrained nonlinear optimization
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
- Stochastic three points method for unconstrained smooth minimization
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
- Trust-region methods without using derivatives: worst case complexity and the nonsmooth case
- Direct search based on probabilistic descent
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- Quadratic regularization methods with finite-difference gradient approximations
- Worst case complexity bounds for linesearch-type derivative-free algorithms
- Evaluation complexity for nonlinear constrained optimization using unscaled KKT conditions and high-order models
- Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization
- Retraction-based direct search methods for derivative free Riemannian optimization
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- Expected complexity analysis of stochastic direct-search
- Improving direct search algorithms by multilevel optimization techniques
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Complexity bounds for second-order optimality in unconstrained optimization
- A note about the complexity of minimizing Nesterov's smooth Chebyshev-Rosenbrock function
- Levenberg-Marquardt method based on probabilistic Jacobian models for nonlinear equations
- A second-order globally convergent direct-search method and its worst-case complexity
- Efficient unconstrained black box optimization
- Derivative-free optimization methods