Batched Stochastic Gradient Descent with Weighted Sampling
Publication: 4609808
DOI: 10.1007/978-3-319-59912-0_14 · zbMath: 1385.65041 · arXiv: 1608.07641 · OpenAlex: W2963244042 · MaRDI QID: Q4609808
Publication date: 26 March 2018
Published in: Springer Proceedings in Mathematics & Statistics
Full work available at URL: https://arxiv.org/abs/1608.07641
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Quadratic programming (90C20); Stochastic programming (90C15)
Related Items
- Randomized Kaczmarz with averaging
- Unnamed Item
- A block-randomized stochastic method with importance sampling for CP tensor decomposition
- Randomized Kaczmarz algorithm with averaging and block projection
- Stochastic greedy algorithms for multiple measurement vectors
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- On optimal probabilities in stochastic coordinate descent methods
- Two-subspace projection method for coherent overdetermined systems
- Minimizing finite sums with the stochastic average gradient
- Pegasos: primal estimated sub-gradient solver for SVM
- Sample size selection in optimization methods for machine learning
- Randomized Kaczmarz solver for noisy linear systems
- A randomized Kaczmarz algorithm with exponential convergence
- Introductory lectures on convex optimization. A basic course.
- Regularization tools version 4.0 for Matlab 7.3
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Decoding by Linear Programming
- Robust Stochastic Approximation Approach to Stochastic Programming
- Weighted SGD for ℓp Regression with Randomized Preconditioning
- Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Optimal Distributed Online Prediction using Mini-Batches
- A Stochastic Approximation Method
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm