A unified analysis of stochastic gradient‐free Frank–Wolfe methods
Publication:6092499
DOI: 10.1111/itor.12889 · OpenAlex: W3097077685 · MaRDI QID: Q6092499
Xiantao Xiao, Jiahong Guo, Huiling Liu
Publication date: 23 November 2023
Published in: International Transactions in Operational Research
Full work available at URL: https://doi.org/10.1111/itor.12889
Cites Work
- Implementation of an optimal first-order method for strongly convex total variation regularization
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Complexity bounds for primal-dual methods minimizing the model of objective function
- First-order and stochastic optimization methods for machine learning
- Random gradient-free minimization of convex functions
- Generalized Conditional Gradient for Sparse Estimation
- Non-convex Optimization for Machine Learning
- Derivative-free optimization methods
- Surrogate‐based methods for black‐box optimization
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- New analysis and results for the Frank-Wolfe method
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization