A discussion on variational analysis in derivative-free optimization
From MaRDI portal
Recommendations
- Derivative-free optimization via proximal point methods
- Derivative-free optimization methods for finite minimax problems
- Compositions of convex functions and fully linear models
- Arc-search in numerical optimization
- Numerical analysis of \(\mathcal{VU}\)-decomposition, \(\mathcal{U}\)-gradient, and \(\mathcal{U}\)-Hessian approximations
Cites work
- scientific article; zbMATH DE number 653035 (no title available)
- scientific article; zbMATH DE number 1971709 (no title available)
- scientific article; zbMATH DE number 1552017 (no title available)
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- A Simplex Method for Function Minimization
- A derivative-free \(\mathcal{V} \mathcal{U}\)-algorithm for convex finite-max problems
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- A mesh adaptive direct search algorithm for multiobjective optimization
- A proximal bundle method for nonsmooth nonconvex functions with inexact information
- A stochastic line search method with expected complexity analysis
- A superlinearly convergent algorithm for minimization without evaluating derivatives
- ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization
- Adaptive Interpolation Strategies in Derivative-Free Optimization: a case study
- Algorithmic construction of the subdifferential from directional derivatives
- An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints
- Analysis of Generalized Pattern Searches
- CONDOR, a new parallel, constrained extension of Powell's UOBYQA algorithm: Experimental results and comparison with the DFO algorithm
- Calculus identities for generalized simplex gradients: rules and applications
- Compositions of convex functions and fully linear models
- Convergence results for generalized pattern search algorithms are tight
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Derivative-free and blackbox optimization
- Derivative-free optimization methods
- Derivative-free optimization via proximal point methods
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Derivative-free robust optimization by outer approximations
- Direct Multisearch for Multiobjective Optimization
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- Efficient calculation of regular simplex gradients
- Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm
- GOSH: derivative-free global optimization using multi-dimensional space-filling curves
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Global convergence of radial basis function trust region derivative-free algorithms
- Globalization strategies for mesh adaptive direct search
- Introduction to Derivative-Free Optimization
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Manifold sampling for \(\ell_1\) nonconvex optimization
- Manifold sampling for optimization of nonconvex functions that are piecewise linear compositions of smooth components
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- Mesh-based Nelder-Mead algorithm for inequality constrained optimization
- Monotonic grey box direct search optimization
- Nonsmooth optimization through mesh adaptive direct search and variable neighborhood search
- ORBIT: Optimization by Radial Basis Function Interpolation in Trust-Regions
- On the Convergence of Pattern Search Algorithms
- On the construction of quadratic models for derivative-free trust-region algorithms
- On the properties of positive spanning sets and positive bases
- On trust region methods for unconstrained minimization without derivatives
- Optimization methods for large-scale machine learning
- Parallel radial basis function methods for the global optimization of expensive functions
- Precision Control for Generalized Pattern Search Algorithms with Adaptive Precision Function Evaluations
- Reducing the number of function evaluations in mesh adaptive direct search algorithms
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- Stochastic optimization using a trust-region method and random models
- Surrogate Optimization of Computationally Expensive Black-Box Problems with Hidden Constraints
- The mesh adaptive direct search algorithm for granular and discrete variables
- The calculus of simplex gradients
- The mesh adaptive direct search algorithm with treed Gaussian process surrogates
- Trust-region methods for the derivative-free optimization of nonsmooth black-box functions
- UOBYQA: unconstrained optimization by quadratic approximation
- ``Direct Search'' Solution of Numerical and Statistical Problems
Cited in (1)
This page was built for publication: A discussion on variational analysis in derivative-free optimization