A discussion on variational analysis in derivative-free optimization
DOI: 10.1007/s11228-020-00556-y · zbMATH Open: 1473.49019 · OpenAlex: W3088043574 · MaRDI QID: Q829491
Authors: Warren L. Hare
Publication date: 6 May 2021
Published in: Set-Valued and Variational Analysis
Full work available at URL: https://doi.org/10.1007/s11228-020-00556-y
Recommendations
- Derivative-free optimization via proximal point methods
- Derivative-free optimization methods for finite minimax problems
- Compositions of convex functions and fully linear models
- Arc-search in numerical optimization
- Numerical analysis of \(\mathcal{VU}\)-decomposition, \(\mathcal{U}\)-gradient, and \(\mathcal{U}\)-Hessian approximations
Keywords: variational analysis; derivative-free optimization; model-based methods; direct-search methods; order-\(N\) accuracy
MSC: Numerical mathematical programming methods (65K05); Nonsmooth analysis (49J52); Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- CONDOR, a new parallel, constrained extension of Powell's UOBYQA algorithm: Experimental results and comparison with the DFO algorithm
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- UOBYQA: unconstrained optimization by quadratic approximation
- ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization
- An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints
- Trust-region methods for the derivative-free optimization of nonsmooth black-box functions
- ``Direct Search'' Solution of Numerical and Statistical Problems
- A Simplex Method for Function Minimization
- A mesh adaptive direct search algorithm for multiobjective optimization
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- On the properties of positive spanning sets and positive bases
- On the Convergence of Pattern Search Algorithms
- Introduction to Derivative-Free Optimization
- Parallel radial basis function methods for the global optimization of expensive functions
- A proximal bundle method for nonsmooth nonconvex functions with inexact information
- Direct Multisearch for Multiobjective Optimization
- Stochastic optimization using a trust-region method and random models
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- The calculus of simplex gradients
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Reducing the number of function evaluations in mesh adaptive direct search algorithms
- Analysis of Generalized Pattern Searches
- Precision Control for Generalized Pattern Search Algorithms with Adaptive Precision Function Evaluations
- Nonsmooth optimization through mesh adaptive direct search and variable neighborhood search
- Globalization strategies for mesh adaptive direct search
- ORBIT: Optimization by Radial Basis Function Interpolation in Trust-Regions
- Surrogate Optimization of Computationally Expensive Black-Box Problems with Hidden Constraints
- On trust region methods for unconstrained minimization without derivatives
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Convergence results for generalized pattern search algorithms are tight
- The mesh adaptive direct search algorithm with treed Gaussian process surrogates
- Global convergence of radial basis function trust region derivative-free algorithms
- A stochastic line search method with expected complexity analysis
- Manifold sampling for \(\ell_1\) nonconvex optimization
- Derivative-free optimization via proximal point methods
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Efficient calculation of regular simplex gradients
- Derivative-free and blackbox optimization
- Optimization methods for large-scale machine learning
- Adaptive Interpolation Strategies in Derivative-Free Optimization: a case study
- Calculus identities for generalized simplex gradients: rules and applications
- Mesh-based Nelder-Mead algorithm for inequality constrained optimization
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- Compositions of convex functions and fully linear models
- On the construction of quadratic models for derivative-free trust-region algorithms
- Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm
- GOSH: derivative-free global optimization using multi-dimensional space-filling curves
- Derivative-free robust optimization by outer approximations
- Monotonic grey box direct search optimization
- Algorithmic construction of the subdifferential from directional derivatives
- A superlinearly convergent algorithm for minimization without evaluating derivatives
- Manifold sampling for optimization of nonconvex functions that are piecewise linear compositions of smooth components
- The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables
- A derivative-free \(\mathcal{V} \mathcal{U}\)-algorithm for convex finite-max problems
- Derivative-free optimization methods
Cited In (1)