Improving the flexibility and robustness of model-based derivative-free optimization solvers
DOI: 10.1145/3338517 · zbMATH Open: 1486.65064 · arXiv: 1804.00154 · OpenAlex: W2968397096 · Wikidata: Q113309988 · Scholia: Q113309988 · MaRDI QID: Q4960952 · FDO: Q4960952
Authors: Coralia Cartis, Jan Fiala, Benjamin Marteau, Lindon Roberts
Publication date: 24 April 2020
Published in: ACM Transactions on Mathematical Software
Full work available at URL: https://arxiv.org/abs/1804.00154
Recommendations
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- A derivative-free Gauss-Newton method
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
Keywords: least-squares, stochastic optimization, performance evaluation, mathematical software, derivative-free optimization, trust region methods
Cites Work
- Testing Unconstrained Optimization Software
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization
- BFO, a trainable derivative-free brute force optimizer for nonlinear bound-constrained optimization and equilibrium computations with continuous and discrete variables
- Benchmarking optimization software with performance profiles
- Trust Region Methods
- Introduction to Derivative-Free Optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Global optimization. Theory, algorithms, and applications
- Direct Multisearch for Multiobjective Optimization
- Stochastic optimization using a trust-region method and random models
- A derivative-free algorithm for least-squares minimization
- Derivative-free optimization of expensive functions with computational error using weighted regression
- The calculus of simplex gradients
- Detection and Remediation of Stagnation in the Nelder-Mead Algorithm Using a Sufficient Decrease Condition
- On trust region methods for unconstrained minimization without derivatives
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- Variable-number sample-path optimization
- Robust preconditioners for the matrix free truncated Newton method
- Methods to compare expensive stochastic optimization algorithms with random restarts
- Non-intrusive termination of noisy optimization
- Derivative-free and blackbox optimization
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- Best practices for comparing optimization algorithms
- Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms
- A derivative-free Gauss-Newton method
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
Cited In (26)
- Model-based derivative-free methods for convex-constrained optimization
- Escaping local minima with local derivative-free methods: a numerical investigation
- A stochastic Levenberg-Marquardt method using random models with complexity results
- Arbitrage-Free Neural-SDE Market Models
- Exploiting Problem Structure in Derivative Free Optimization
- Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Decomposition in derivative-free optimization
- Joint inversion of high-frequency induction and lateral logging sounding data in Earth models with tilted principal axes of the electrical resistivity tensor
- Cluster Gauss-Newton method. An algorithm for finding multiple approximate minimisers of nonlinear least squares problems with applications to parameter estimation of pharmacokinetic models
- Optimal control of spins by analytical Lie algebraic derivatives
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- Inexact derivative-free optimization for bilevel learning
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- An empirical study of derivative-free-optimization algorithms for targeted black-box attacks in deep neural networks
- Hermite least squares optimization: a modification of BOBYQA for optimization with limited derivative information
- Derivative-free bound-constrained optimization for solving structured problems with surrogate models
- Efficient Yield Optimization with Limited Gradient Information
- Effective matrix adaptation strategy for noisy derivative-free optimization
- Manifold sampling for optimizing nonsmooth nonconvex compositions
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- A geometric integration approach to nonsmooth, nonconvex optimisation
- Gravity in the infrared and effective nonlocal models
- Parallel-in-time optimization of induction motors
- Derivative-free optimization methods