A derivative-free optimization algorithm for the efficient minimization of functions obtained via statistical averaging
From MaRDI portal
Publication: 1986101
Abstract: This paper considers the efficient minimization of the infinite-time average of a stationary ergodic process over a handful of design parameters that affect it. Problems of this class, derived from physical or numerical experiments that are often expensive to perform, are ubiquitous in engineering applications. In such problems, any given function evaluation, determined with finite sampling, carries a quantifiable amount of uncertainty, which may be reduced via additional sampling. The present paper proposes a new optimization algorithm that adjusts the amount of sampling associated with each function evaluation, making function evaluations more accurate (and, thus, more expensive) as convergence is approached. The work builds on our algorithm for Delaunay-based Derivative-free Optimization via Global Surrogates (Δ-DOGS). The new algorithm, dubbed α-DOGS, substantially reduces the overall cost of the optimization process for problems of this important class. Further, under certain well-defined conditions, a rigorous proof of convergence to the global minimum of the problem considered is established.
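The core idea in the abstract, increasing the sampling (and hence the accuracy and cost) of function evaluations only as convergence is approached, can be sketched as follows. This is a minimal toy illustration of adaptive sampling over a fixed candidate grid, not the Δ-DOGS/α-DOGS surrogate machinery: the quadratic "truth", the noise level, and the candidate grid are all illustrative assumptions.

```python
import math
import random

# NOTE: toy sketch only -- the quadratic "truth", the noise level, and the
# candidate grid below are illustrative assumptions, not part of the paper.

NOISE_STD = 0.3  # assumed known standard deviation of the sampling noise

def noisy_objective(x, n_samples):
    """Estimate the infinite-time average at design point x from n_samples
    finite samples; returns (mean estimate, standard error)."""
    true_mean = (x - 0.3) ** 2  # hypothetical underlying time average
    total = sum(true_mean + random.gauss(0.0, NOISE_STD)
                for _ in range(n_samples))
    return total / n_samples, NOISE_STD / math.sqrt(n_samples)

def adaptive_sampling_search(candidates, rounds=6, n0=32):
    """Every candidate first gets a cheap, noisy estimate; each round, the
    candidate with the lowest optimistic bound (mean - stderr) has its
    sampling budget doubled and is re-estimated, so evaluations become
    more accurate (and more expensive) only where the minimum seems to be."""
    random.seed(0)  # deterministic sketch
    budget = {x: n0 for x in candidates}
    estimate = {x: noisy_objective(x, n0) for x in candidates}
    for _ in range(rounds):
        best = min(candidates, key=lambda x: estimate[x][0] - estimate[x][1])
        budget[best] *= 2  # double the sampling at the promising point
        estimate[best] = noisy_objective(best, budget[best])
    x_best = min(candidates, key=lambda x: estimate[x][0])
    return x_best, budget

x_best, budget = adaptive_sampling_search([i / 10 for i in range(11)])
```

After the search, `budget` shows that only the handful of candidates that ever looked optimal were sampled more heavily, which is the cost-saving mechanism the paper formalizes (with a Delaunay-based global surrogate deciding where to evaluate, rather than a fixed grid).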
Recommendations
- Stochastic derivative-free optimization using a trust region framework
- Derivative-free optimization of expensive functions with computational error using weighted regression
- Derivative-free robust optimization by outer approximations
- A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling
- Delaunay-based derivative-free optimization via global surrogates. I: Linear constraints
Cites work
- scientific article; zbMATH DE number 45848
- scientific article; zbMATH DE number 3533448
- scientific article; zbMATH DE number 795280
- scientific article; zbMATH DE number 847242
- scientific article; zbMATH DE number 5485582
- A branch and bound method for stochastic global optimization
- A method for stochastic constrained optimization using derivative-free surrogate pattern search and collocation
- A progressive barrier derivative-free trust-region algorithm for constrained optimization
- An introduction to polynomial and semi-algebraic optimization
- Average-cost control of stochastic manufacturing systems.
- Bounding averages rigorously using semidefinite programming: mean moments of the Lorenz system
- DNS-based predictive control of turbulence: An optimal benchmark for feedback algorithms
- Delaunay-based derivative-free optimization via global surrogates. II: Convex constraints
- Derivative-free and blackbox optimization
- Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm
- Exploring or reducing noise? A global optimization algorithm in the presence of noise
- Gaussian processes for machine learning.
- Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points
- Implementation of Cartesian grids to accelerate Delaunay-based derivative-free optimization
- Information-Theoretic Regret Bounds for Gaussian Process Optimization in the Bandit Setting
- Introduction to Derivative-Free Optimization
- Lipschitzian optimization without the Lipschitz constant
- Mesh-based Nelder-Mead algorithm for inequality constrained optimization
- Optimal aeroacoustic shape design using the surrogate management framework
- Suppression of vortex-shedding noise via derivative-free shape optimization
- \(X\)-armed bandits
Cited in (6)
- A branch-and-bound algorithm with growing datasets for large-scale parameter estimation
- A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling
- A derivative-free optimization algorithm based on conditional moments
- A new Bayesian approach to global optimization on parametrized surfaces in \(\mathbb{R}^3\)
- Optimization of stochastic blackboxes with adaptive precision
- Loss functions for finite sets