Block mirror stochastic gradient method for stochastic optimization
DOI: 10.1007/s10915-023-02110-y
zbMath: 1519.90143
MaRDI QID: Q6158991
Jinda Yang, Di Hou, Haiming Song, Xinxin Li
Publication date: 20 June 2023
Published in: Journal of Scientific Computing
Keywords: stochastic optimization, convex optimization, nonconvex optimization, block coordinate descent, stochastic gradient
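The title and keywords point to a block coordinate variant of mirror descent driven by stochastic gradients (cf. the cited "Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization" below). For orientation only, here is a minimal sketch of a generic step of that kind: each iteration samples one block at random and updates it by a mirror descent step with a noisy partial gradient. The entropy mirror map, the simplex constraints, and all names (`block_mirror_sgd`, `stoch_grad`, `step_size`) are illustrative assumptions, not the algorithm from the paper.

```python
import numpy as np

def entropy_mirror_step(x_block, g_block, eta):
    """Mirror-descent step with the entropy mirror map: the Bregman
    divergence is the KL divergence, so the update is multiplicative
    and keeps the block on the probability simplex."""
    z = x_block * np.exp(-eta * g_block)
    return z / z.sum()

def block_mirror_sgd(stoch_grad, x0_blocks, step_size, n_iters, seed=0):
    """Generic block mirror stochastic gradient loop (illustrative sketch,
    not the paper's method): each iteration samples one block uniformly at
    random and updates only that block with a stochastic partial gradient."""
    rng = np.random.default_rng(seed)
    x = [b.copy() for b in x0_blocks]
    for k in range(n_iters):
        i = int(rng.integers(len(x)))      # random block index
        g_i = stoch_grad(x, i, rng)        # noisy gradient w.r.t. block i
        x[i] = entropy_mirror_step(x[i], g_i, step_size(k))
    return x

# Toy usage: minimize E[<c_i + noise, x_i>] over two probability simplices.
c = [np.array([0.3, 1.0, 2.0]), np.array([1.5, 0.2])]

def stoch_grad(x, i, rng):
    return c[i] + 0.1 * rng.standard_normal(c[i].shape)

x0 = [np.ones(3) / 3, np.ones(2) / 2]
x_opt = block_mirror_sgd(stoch_grad, x0,
                         step_size=lambda k: 0.5 / np.sqrt(k + 1),
                         n_iters=2000)
# Mass should concentrate on the smallest-cost coordinate of each block.
```

With a Euclidean mirror map in place of the entropy map, the same loop reduces to block coordinate projected SGD; the entropy choice is used here only because it gives a closed-form step on the simplex.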
Cites Work
- An optimal method for stochastic composite optimization
- Minimizing finite sums with the stochastic average gradient
- Pegasos: primal estimated sub-gradient solver for SVM
- A coordinate gradient descent method for nonsmooth separable minimization
- Introductory lectures on convex optimization. A basic course.
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- The sample average approximation method applied to stochastic routing problems: a computational study
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- First-order and stochastic optimization methods for machine learning
- Parallel stochastic gradient algorithms for large-scale matrix completion
- The empirical behavior of sampling methods for stochastic programming
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Feature Article: Optimization for simulation: Theory vs. Practice
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Variational Analysis
- First-Order Methods in Optimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- A Stochastic Approximation Method
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization