A Derivative-Free Trust-Region Algorithm for the Optimization of Functions Smoothed via Gaussian Convolution Using Adaptive Multiple Importance Sampling
Publication: 4641668
DOI: 10.1137/15M1031679 · zbMath: 1390.90410 · OpenAlex: W2605076685 · MaRDI QID: Q4641668
Authors: Alvaro Maggiar, Andreas Wächter, Irina S. Dolinskaya, Jeremy Staum
Publication date: 18 May 2018
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/15m1031679
Keywords: Monte Carlo sampling; derivative-free optimization; trust-region method; Gaussian smoothing; adaptive multiple importance sampling; deterministic computational noise
MSC: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Stochastic programming (90C15)
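This record only names the paper's ingredients, so as a rough, generic illustration of two of the keywords (Gaussian smoothing and multiple importance sampling) the sketch below estimates a Gaussian-convolution-smoothed objective \(f_\mu(x) = \mathbb{E}_{u \sim N(x, \mu^2 I)}[f(u)]\) by reusing samples drawn around several earlier sample centers with balance-heuristic mixture weights. It is not the algorithm of the paper (which embeds such estimates in an adaptive trust-region framework); the function names, toy objective, and parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_pdf(u, mean, mu):
    """Density of N(mean, mu^2 I) evaluated at each row of u."""
    d = u.shape[-1]
    diff = u - mean
    return np.exp(-0.5 * np.sum(diff**2, axis=-1) / mu**2) / ((2 * np.pi * mu**2) ** (d / 2))

def mis_smoothed_value(x, centers, samples, fvals, mu):
    """
    Balance-heuristic multiple-importance-sampling estimate of the
    Gaussian-smoothed objective f_mu(x) = E_{u ~ N(x, mu^2 I)}[f(u)],
    reusing samples previously drawn around earlier centers.

    centers : (K, d) array of past sample centers
    samples : (K, N, d) array, samples[k] drawn from N(centers[k], mu^2 I)
    fvals   : (K, N) array of f evaluated at those samples
    """
    K, N, d = samples.shape
    flat = samples.reshape(K * N, d)
    target = gaussian_pdf(flat, x, mu)                          # phi(u; x, mu^2 I)
    mixture = np.mean([gaussian_pdf(flat, c, mu) for c in centers], axis=0)
    weights = target / mixture                                   # balance-heuristic weights
    return np.mean(weights * fvals.reshape(K * N))

# Hypothetical usage on a toy quadratic objective (all values are illustrative)
rng = np.random.default_rng(0)
f = lambda u: np.sum(u**2, axis=-1)
mu, d, K, N = 0.5, 2, 3, 200
centers = rng.normal(size=(K, d))
samples = centers[:, None, :] + mu * rng.normal(size=(K, N, d))
fvals = f(samples)
print(mis_smoothed_value(np.zeros(d), centers, samples, fvals, mu))  # approx d * mu^2 = 0.5
```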
Related Items
- Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Finite Difference Gradient Approximation: To Randomize or Not?
- Robust design optimization for enhancing delamination resistance of composites
- Two-stage nested simulation of tail risk measurement: a likelihood ratio approach
- Limiting behaviour of the generalized simplex gradient as the number of points tends to infinity on a fixed shape in \(\mathbb{R}^n\)
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Bilevel parameter learning for nonlocal image denoising models
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
- Derivative-free optimization methods
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Linesearch Newton-CG methods for convex optimization with noise
Uses Software
Cites Work
- Stochastic derivative-free optimization using a trust region framework
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Stochastic optimization using a trust-region method and random models
- UOBYQA: unconstrained optimization by quadratic approximation
- Probability essentials.
- The effect of deterministic noise in subgradient methods
- Geometry of interpolation sets in derivative free optimization
- Convergence of Trust-Region Methods Based on Probabilistic Models
- Estimating Derivatives of Noisy Simulations
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- Estimating Computational Noise
- Adaptive Multiple Importance Sampling
- Implicit Filtering
- Interaction of finite-amplitude waves with vertically sheared current fields
- On the geometry phase in model-based algorithms for derivative-free optimization
- Introduction to Derivative-Free Optimization
- "Direct Search" Solution of Numerical and Statistical Problems
- Testing Unconstrained Optimization Software
- Trust Region Methods
- Safe and Effective Importance Sampling
- ASTRO-DF: A Class of Adaptive Sampling Trust-Region Algorithms for Derivative-Free Stochastic Optimization
- Advances and Trends in Optimization with Engineering Applications
- Adaptative Monte Carlo Method, A Variance Reduction Technique
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- Derivative-Free Optimization of Expensive Functions with Computational Error Using Weighted Regression
- A Simplex Method for Function Minimization
- Wedge trust region method for derivative free optimization.