Zeroth-order methods for noisy Hölder-gradient functions
DOI: 10.1007/s11590-021-01742-z
zbMath: 1497.90156
arXiv: 2006.11857
OpenAlex: W3157347217
MaRDI QID: Q2162695
Pavel Dvurechensky, Innokentiy Shibaev, Alexander V. Gasnikov
Publication date: 9 August 2022
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/2006.11857
Related Items (5)
- Oracle complexity separation in convex optimization
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Unifying framework for accelerated randomized methods in convex optimization
- Recent Theoretical Advances in Non-Convex Optimization
Cites Work
- Universal gradient methods for convex optimization problems
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Random gradient-free minimization of convex functions
- Introduction to Derivative-Free Optimization
- Introduction to Stochastic Search and Optimization
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Derivative-free optimization methods
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Stochastic Approximation of Minima with Improved Asymptotic Speed
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization