A derivative-free optimization algorithm for the efficient minimization of functions obtained via statistical averaging

From MaRDI portal
Publication:1986101

DOI: 10.1007/S10589-020-00172-4
zbMATH Open: 1433.90193
arXiv: 1910.12393
OpenAlex: W3004639768
MaRDI QID: Q1986101
FDO: Q1986101


Authors: Pooriya Beyhaghi, Ryan Alimo, Thomas R. Bewley


Publication date: 7 April 2020

Published in: Computational Optimization and Applications

Abstract: This paper considers the efficient minimization of the infinite time average of a stationary ergodic process in the space of a handful of design parameters which affect it. Problems of this class, derived from physical or numerical experiments which are sometimes expensive to perform, are ubiquitous in engineering applications. In such problems, any given function evaluation, determined with finite sampling, is associated with a quantifiable amount of uncertainty, which may be reduced via additional sampling. The present paper proposes a new optimization algorithm to adjust the amount of sampling associated with each function evaluation, making function evaluations more accurate (and, thus, more expensive), as required, as convergence is approached. The work builds on our algorithm for Delaunay-based Derivative-free Optimization via Global Surrogates (Delta-DOGS). The new algorithm, dubbed alpha-DOGS, substantially reduces the overall cost of the optimization process for problems of this important class. Further, under certain well-defined conditions, rigorous proof of convergence to the global minimum of the problem considered is established.
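The core idea the abstract describes — spending little on sampling early and increasing the averaging budget per function evaluation as convergence is approached — can be illustrated with a toy sketch. This is not the authors' alpha-DOGS algorithm (which uses Delaunay-based global surrogates); it is a minimal stand-in using random search on an assumed noisy quadratic, with a sampling budget that grows over iterations:

```python
import random

def noisy_objective(x, n_samples, rng):
    """Estimate f(x) = x^2 by averaging n_samples noisy draws.
    Stand-in for a statistically averaged (e.g. finite-time ergodic) quantity:
    more samples -> more accurate, more expensive evaluation."""
    return sum(x * x + rng.gauss(0.0, 0.5) for _ in range(n_samples)) / n_samples

def adaptive_sampling_search(lo, hi, iterations=200, seed=0):
    """Toy derivative-free search (NOT alpha-DOGS): candidate points are
    drawn uniformly, and the per-evaluation sampling budget grows with the
    iteration count, so late (near-convergence) evaluations are averaged
    over more samples and are therefore more accurate."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for k in range(1, iterations + 1):
        n_samples = 4 + k // 10   # sampling effort increases as we proceed
        x = rng.uniform(lo, hi)
        f = noisy_objective(x, n_samples, rng)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

x_star, f_star = adaptive_sampling_search(-2.0, 2.0)
```

The hypothetical schedule `4 + k // 10` is only for illustration; the paper's contribution is precisely a principled, convergence-aware rule for allocating this sampling effort within a surrogate-based global optimizer.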


Full work available at URL: https://arxiv.org/abs/1910.12393






Cited In (6)






