An accelerated variance reducing stochastic method with Douglas-Rachford splitting
From MaRDI portal
Publication: 2425236
Recommendations
- A proximal stochastic gradient method with progressive variance reduction
- Minimizing finite sums with the stochastic average gradient
- Asymptotic estimates for \(r\)-Whitney numbers of the second kind
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- A line search based proximal stochastic gradient algorithm with dynamical variance reduction
Cites work
- A Stochastic Approximation Method
- A proximal stochastic gradient method with progressive variance reduction
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Catalyst acceleration for first-order convex optimization: from theory to practice
- Convex analysis and monotone operator theory in Hilbert spaces
- Efficient online and batch learning using forward backward splitting
- Introductory lectures on convex optimization. A basic course.
- Katyusha: the first direct acceleration of stochastic gradient methods
- Minimizing finite sums with the stochastic average gradient
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Optimization methods for large-scale machine learning
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Sparse online learning via truncated gradient
- Stochastic dual coordinate ascent methods for regularized loss minimization
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
Cited in (7)
- Approximation of backward stochastic partial differential equations by a splitting-up method
- Anderson Accelerated Douglas-Rachford Splitting
- Scientific article (zbMATH DE number 6719722; no title available)
- Accelerated stochastic variance reduction for a class of convex optimization problems
- A line search based proximal stochastic gradient algorithm with dynamical variance reduction
- A proximal stochastic gradient method with progressive variance reduction
- Efficient algorithms for implementing incremental proximal-point methods