Hermite least squares optimization: a modification of BOBYQA for optimization with limited derivative information
Publication: 6088560
Abstract: In this work we propose two Hermite-type optimization methods, Hermite least squares and Hermite BOBYQA, specialized for the case that some partial derivatives of the objective function are available and others are not. The main objective is to reduce the number of objective function calls while maintaining the convergence properties. Both methods are modifications of Powell's derivative-free BOBYQA algorithm, but instead of building the quadratic subproblem in each iteration by (underdetermined) interpolation, the training data is enriched with first and, if possible, second order derivatives, and (weighted) least squares regression is used. Proofs of global convergence are discussed and numerical results are presented. Furthermore, the applicability is verified for a realistic test case in the context of yield optimization. Numerical tests show that the Hermite least squares approach outperforms classic BOBYQA if half or more of the partial derivatives are available. In addition, the Hermite-type approaches achieve greater robustness, and thus better performance, in the case of noisy objective functions.
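The modeling step described in the abstract can be sketched as follows: instead of fitting the quadratic model to function values only, every available partial derivative contributes one extra regression row, and the coefficients are found by least squares. The sketch below is a minimal illustration under assumed conventions, not the authors' implementation; the function names (quad_features, grad_features, hermite_ls_model), the coefficient layout, and the uniform row weighting are all assumptions.

```python
import numpy as np

def quad_features(y):
    """Feature row so that features(y) @ coef = c + g.y + 0.5 * y^T H y.

    Assumed coefficient layout: [c, g_1..g_n, upper triangle of H row
    by row]; H is symmetric, so diagonal features carry the 0.5 factor
    and off-diagonal features absorb the factor 2 from symmetry."""
    n = len(y)
    row = [1.0, *y]
    for i in range(n):
        for j in range(i, n):
            row.append(0.5 * y[i] ** 2 if i == j else y[i] * y[j])
    return row

def grad_features(y, k):
    """Feature row for the partial derivative dm/dx_k of the model at y."""
    n = len(y)
    row = [0.0] * (1 + n)
    row[1 + k] = 1.0                       # d(g.x)/dx_k = g_k
    for i in range(n):
        for j in range(i, n):
            if i == j:
                row.append(y[k] if i == k else 0.0)
            else:
                row.append(y[j] if i == k else (y[i] if j == k else 0.0))
    return row

def hermite_ls_model(Y, f_vals, partials):
    """Fit the quadratic model coefficients by least squares regression.

    Y        : (p, n) array of sample points
    f_vals   : (p,)  objective values at those points
    partials : dict, sample index -> {k: df/dx_k}, holding only those
               partial derivatives that happen to be available
    """
    rows, rhs = [], []
    for s, y in enumerate(np.asarray(Y, dtype=float)):
        rows.append(quad_features(y))
        rhs.append(f_vals[s])
        for k, dfk in partials.get(s, {}).items():
            rows.append(grad_features(y, k))   # enrich with derivative data
            rhs.append(dfk)
    # Row weighting (e.g. down-weighting far-away sample points) would
    # turn this into the weighted regression variant; uniform weights
    # are used here for simplicity.
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return coef
```

With p sample points and q available partial derivatives this gives p + q regression rows for the 1 + n + n(n+1)/2 model coefficients, which is why fewer objective function evaluations can suffice to determine the model than with interpolation on function values alone.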
Recommendations
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- A Derivative-Free Nonlinear Least Squares Solver
- Derivative-free optimization methods
- A derivative-free algorithm for least-squares minimization
Cites work
- scientific article; zbMATH DE number 5774819
- scientific article; zbMATH DE number 4055377
- scientific article; zbMATH DE number 653035
- scientific article; zbMATH DE number 852536
- A Simplex Method for Function Minimization
- Derivative-free and blackbox optimization
- Derivative-free optimization of expensive functions with computational error using weighted regression
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Escaping local minima with local derivative-free methods: a numerical investigation
- Geometry of interpolation sets in derivative free optimization
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- Introduction to Derivative-Free Optimization
- Lipschitzian optimization without the Lipschitz constant
- On fast trust region methods for quadratic models with linear constraints
- On multivariate Hermite interpolation
- Optimization by simulated annealing
- UOBYQA: unconstrained optimization by quadratic approximation
- Yield optimization based on adaptive Newton-Monte Carlo and polynomial surrogates