Simple and cumulative regret for continuous noisy optimization
Recommendations
- Evolution strategies with additive noise: a convergence rate lower bound
- Handling expensive optimization with large noise
- A direct search algorithm for optimization with noisy function evaluations
- Approximate implementations of pure random search in the presence of noise
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
Cites work
- scientific article; zbMATH DE number 823069 (no title available)
- A Continuous Kiefer-Wolfowitz Procedure for Random Processes
- A Family of Variable-Metric Methods Derived by Variational Means
- A Stochastic Approximation Method
- A new approach to variable metric algorithms
- Adaptive stochastic approximation by the simultaneous perturbation method
- Conditioning of Quasi-Newton Methods for Function Minimization
- Extremal Eigenvalues of Real Symmetric Matrices with Entries in an Interval
- Feedback and Weighting Mechanisms for Improving Jacobian Estimates in the Adaptive Simultaneous Perturbation Algorithm
- Handling expensive optimization with large noise
- Introduction to Stochastic Search and Optimization
- Lower rate of convergence for locating a maximum of a function
- Noisy optimization complexity under locality assumption
- On the Kiefer-Wolfowitz approximation method
- Pure exploration in finitely-armed and continuous-armed bandits
- Stochastic Approximation of Minima with Improved Asymptotic Speed
- Stochastic Estimation of the Maximum of a Regression Function
- The Convergence of a Class of Double-rank Minimization Algorithms
- The NEWUOA software for unconstrained optimization without derivatives
- Viability theory
Cited in (5)
MaRDI item: Q905845