Full-low evaluation methods for derivative-free optimization
Publication: 5882241
DOI: 10.1080/10556788.2022.2142582
OpenAlex: W3183503617
MaRDI QID: Q5882241
Authors: O. Sohab, Albert S. Berahas, Luis Nunes Vicente
Publication date: 15 March 2023
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/2107.11908
Cites Work
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- Analysis of direct searches for discontinuous functions
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- A derivative-free nonmonotone line-search technique for unconstrained optimization
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- Incorporating minimum Frobenius norm models in direct search
- Order-based error for managing ensembles of surrogates in mesh adaptive direct search
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Random gradient-free minimization of convex functions
- Geometry of interpolation sets in derivative free optimization
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Estimating Derivatives of Noisy Simulations
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Estimating Computational Noise
- On the Convergence of Pattern Search Algorithms
- Implicit Filtering
- Using Sampling and Simplex Derivatives in Pattern Search Methods
- Developments of NEWUOA for minimization without derivatives
- On the geometry phase in model-based algorithms for derivative-free optimization
- Introduction to Derivative-Free Optimization
- Optimization and nonsmooth analysis
- Multivariate stochastic approximation using a simultaneous perturbation gradient approximation
- Analysis of Generalized Pattern Searches
- Trust Region Methods
- Derivative-Free and Blackbox Optimization
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Use of quadratic models with mesh-adaptive direct search for constrained black box optimization
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Function Minimization by Interpolation in a Data Table
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- Trust-Region Methods for the Derivative-Free Optimization of Nonsmooth Black-Box Functions
- Derivative-free optimization methods
- Derivative-Free Optimization of Expensive Functions with Computational Error Using Weighted Regression
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- Direct Search Based on Probabilistic Descent
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Stochastic Estimation of the Maximum of a Regression Function
- Benchmarking optimization software with performance profiles
- Two decades of blackbox optimization applications