Improving the flexibility and robustness of model-based derivative-free optimization solvers
Abstract: We present DFO-LS, a software package for derivative-free optimization (DFO) for nonlinear least-squares (LS) problems, with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear regression models for the residuals. DFO-LS allows flexible initialization for expensive problems, whereby it can begin making progress from as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for one gradient evaluation. DFO-LS has improved robustness to noise, supporting sample averaging, regression-based model construction, and multiple restart strategies together with an auto-detection mechanism. Our extensive numerical experimentation shows that restarting the solver when stagnation is detected is a cheap and effective mechanism for achieving robustness, with superior performance over both sampling and regression techniques. We also present our package Py-BOBYQA, a Python implementation of BOBYQA (Powell, 2009), which likewise implements strategies for robustness to noise. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers on noisy problems. In our comparisons, we introduce a new adaptive accuracy measure for the data profiles of noisy functions that balances measuring improvement in the true and the noisy objective.
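The abstract describes the solver interfaces only at a high level; as a concrete illustration, the sketch below shows minimal usage of the released DFO-LS package (the dfols module on PyPI), where the user supplies a function returning the residual vector rather than the scalar objective. The keyword names bounds and objfun_has_noise follow the package's public documentation, but this is an illustrative sketch and should be checked against the installed version.

    import numpy as np
    import dfols

    # Residual vector for the 2D Rosenbrock problem: DFO-LS minimizes
    # the sum of squares ||r(x)||^2 of the returned residuals.
    def rosenbrock_residuals(x):
        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    x0 = np.array([-1.2, 1.0])
    lower, upper = np.array([-5.0, -5.0]), np.array([5.0, 5.0])

    # Setting objfun_has_noise=True would switch on the noise-robustness
    # features (regression models, sample averaging, restarts) described
    # in the abstract.
    soln = dfols.solve(rosenbrock_residuals, x0,
                       bounds=(lower, upper),  # optional bound constraints
                       objfun_has_noise=False)

    print(soln.x)  # approximate minimizer
    print(soln.f)  # final sum-of-squares objective value

Py-BOBYQA exposes an analogous entry point, pybobyqa.solve(objfun, x0), for general (scalar-valued) objectives, again with an objfun_has_noise option for noisy problems.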
Recommendations
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- A derivative-free Gauss-Newton method
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
Cites work
- Scientific article (zbMATH DE number 1049347; title unavailable)
- Scientific article (zbMATH DE number 5060482; title unavailable)
- A derivative-free Gauss-Newton method
- A derivative-free algorithm for least-squares minimization
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization
- BFO, a trainable derivative-free brute force optimizer for nonlinear bound-constrained optimization and equilibrium computations with continuous and discrete variables
- Benchmarking Derivative-Free Optimization Algorithms
- Benchmarking optimization software with performance profiles
- Best practices for comparing optimization algorithms
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Derivative-free and blackbox optimization
- Derivative-free optimization of expensive functions with computational error using weighted regression
- Detection and Remediation of Stagnation in the Nelder-Mead Algorithm Using a Sufficient Decrease Condition
- Direct Multisearch for Multiobjective Optimization
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Global optimization. Theory, algorithms, and applications
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- Introduction to Derivative-Free Optimization
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Methods to compare expensive stochastic optimization algorithms with random restarts
- Non-intrusive termination of noisy optimization
- On trust region methods for unconstrained minimization without derivatives
- Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms
- Robust preconditioners for the matrix free truncated Newton method
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- Stochastic optimization using a trust-region method and random models
- Testing Unconstrained Optimization Software
- The calculus of simplex gradients
- Trust Region Methods
- Variable-number sample-path optimization
Cited in (26 documents)
- Model-based derivative-free methods for convex-constrained optimization
- Escaping local minima with local derivative-free methods: a numerical investigation
- A stochastic Levenberg-Marquardt method using random models with complexity results
- Arbitrage-Free Neural-SDE Market Models
- Exploiting Problem Structure in Derivative Free Optimization
- Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
- Decomposition in derivative-free optimization
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Cluster Gauss-Newton method. An algorithm for finding multiple approximate minimisers of nonlinear least squares problems with applications to parameter estimation of pharmacokinetic models
- Joint inversion of high-frequency induction and lateral logging sounding data in Earth models with tilted principal axes of the electrical resistivity tensor
- Optimal control of spins by analytical Lie algebraic derivatives
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- Inexact derivative-free optimization for bilevel learning
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- An empirical study of derivative-free-optimization algorithms for targeted black-box attacks in deep neural networks
- Hermite least squares optimization: a modification of BOBYQA for optimization with limited derivative information
- Derivative-free bound-constrained optimization for solving structured problems with surrogate models
- Efficient Yield Optimization with Limited Gradient Information
- Effective matrix adaptation strategy for noisy derivative-free optimization
- Manifold sampling for optimizing nonsmooth nonconvex compositions
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- A geometric integration approach to nonsmooth, nonconvex optimisation
- Gravity in the infrared and effective nonlocal models
- Derivative-free optimization methods
- Parallel-in-time optimization of induction motors