Hermite least squares optimization: a modification of BOBYQA for optimization with limited derivative information


DOI: 10.1007/s11081-023-09795-y · arXiv: 2204.05022 · OpenAlex: W4365449150 · MaRDI QID: Q6088560 · FDO: Q6088560


Authors: Mona Fuhrländer, Sebastian Schöps


Publication date: 16 November 2023

Published in: Optimization and Engineering

Abstract: In this work we propose two Hermite-type optimization methods, Hermite least squares and Hermite BOBYQA, specialized for the case that some partial derivatives of the objective function are available and others are not. The main objective is to reduce the number of objective function calls while maintaining the convergence properties. Both methods are modifications of Powell's derivative-free BOBYQA algorithm. However, instead of (underdetermined) interpolation, the quadratic subproblem in each iteration is built by enriching the training data with first- and, where available, second-order derivatives and then applying (weighted) least squares regression. Proofs of global convergence are discussed and numerical results are presented. Furthermore, the applicability is verified for a realistic test case in the context of yield optimization. Numerical tests show that the Hermite least squares approach outperforms classic BOBYQA if at least half of the partial derivatives are available. In addition, the Hermite-type approaches are more robust and thus perform better on noisy objective functions.


Full work available at URL: https://arxiv.org/abs/2204.05022
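
The model-building step described in the abstract can be illustrated with a short sketch: instead of interpolating function values alone, the regression matrix is augmented with rows that match the model's partial derivatives to the available partial derivatives of the objective. The Python code below is not the authors' implementation; it is a minimal sketch assuming a 2-D problem, a full quadratic monomial basis, and a single uniform weight for derivative rows (the paper's weighting scheme and BOBYQA's trust-region machinery are omitted). All function names and parameters are illustrative.

```python
import numpy as np

def quadratic_basis(x):
    """Monomial basis of a 2-D quadratic: [1, x1, x2, x1^2, x1*x2, x2^2]."""
    x1, x2 = x
    return np.array([1.0, x1, x2, x1**2, x1 * x2, x2**2])

def quadratic_basis_grad(x):
    """Partial derivatives of each basis monomial; row 0 = d/dx1, row 1 = d/dx2."""
    x1, x2 = x
    return np.array([
        [0.0, 1.0, 0.0, 2 * x1, x2, 0.0],   # d/dx1 of each monomial
        [0.0, 0.0, 1.0, 0.0, x1, 2 * x2],   # d/dx2 of each monomial
    ])

def hermite_least_squares_model(points, f_vals, grads, known, weight=1.0):
    """Fit quadratic model coefficients by weighted least squares.

    points : sample points x_i
    f_vals : objective values f(x_i)
    grads  : gradient estimates at x_i (entries for unknown partials are ignored)
    known  : boolean mask; known[j] is True if df/dx_j is available
    weight : relative weight of derivative rows vs. function-value rows
    """
    rows, rhs = [], []
    for x, f, g in zip(points, f_vals, grads):
        rows.append(quadratic_basis(x))           # condition m(x_i) = f(x_i)
        rhs.append(f)
        db = quadratic_basis_grad(x)
        for j in range(len(x)):
            if known[j]:                          # Hermite condition dm/dx_j = df/dx_j
                rows.append(weight * db[j])
                rhs.append(weight * g[j])
    A, b = np.array(rows), np.array(rhs)
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef
```

A small usage example under the same assumptions: fitting f(x) = x1^2 + 3*x2^2 from five random points when only df/dx1 is assumed available recovers the exact coefficients, since the extra derivative rows make the regression overdetermined but consistent.

```python
rng = np.random.default_rng(0)
pts = [rng.uniform(-1, 1, 2) for _ in range(5)]
f = lambda x: x[0]**2 + 3 * x[1]**2
grad = lambda x: np.array([2 * x[0], 6 * x[1]])
coef = hermite_least_squares_model(
    pts, [f(x) for x in pts], [grad(x) for x in pts],
    known=[True, False],  # only df/dx1 assumed available
)
# coef is approximately [0, 0, 0, 1, 0, 3]
```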



