A stochastic variance reduced gradient method with adaptive step for stochastic optimization
From MaRDI portal
Publication:6565722
Recommendations
- Analysis and improvement for a class of variance reduced methods
- A stochastic variance reduced gradient using Barzilai-Borwein techniques as second order information
- Improved SVRG for finite sum structure optimization with application to binary classification
- Stochastic gradient method with Barzilai-Borwein step for unconstrained nonlinear optimization
- Stochastic variance reduced gradient methods using a trust-region-like scheme
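The publication and several of the recommended works pair SVRG with Barzilai–Borwein (BB) adaptive stepsizes. As a minimal illustrative sketch (not the paper's exact method), the following implements plain SVRG on a least-squares objective with a BB-style stepsize recomputed once per epoch from successive snapshot gradients; all hyperparameters here are arbitrary choices:

```python
import numpy as np

def svrg_bb(A, b, epochs=20, m=None, eta0=0.01, seed=0):
    """SVRG with a Barzilai-Borwein stepsize recomputed once per epoch,
    on the least-squares objective f(w) = (1/2n) * ||A w - b||^2.
    Illustrative sketch only; not the algorithm from this publication."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = m or 2 * n                          # inner-loop length (heuristic)
    w = np.zeros(d)
    eta = eta0
    w_prev = g_prev = None
    for _ in range(epochs):
        g = A.T @ (A @ w - b) / n           # full gradient at the snapshot
        if w_prev is not None:
            # BB step averaged over the inner loop: ||s||^2 / (m * s^T y)
            s, y = w - w_prev, g - g_prev
            denom = m * abs(s @ y)
            if denom > 1e-12:
                eta = (s @ s) / denom
        w_prev, g_prev = w.copy(), g.copy()
        w_snap, wk = w.copy(), w.copy()
        for _ in range(m):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ wk - b[i])          # sample gradient at wk
            gi_snap = A[i] * (A[i] @ w_snap - b[i])  # same sample at snapshot
            wk -= eta * (gi - gi_snap + g)          # variance-reduced update
        w = wk
    return w
```

On a consistent linear system the variance-reduced gradient vanishes at the solution, so the iterates converge without decaying the stepsize; the BB ratio adapts `eta` to the local curvature of the quadratic.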
Cites work
- A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems
- A Stochastic Approximation Method
- Adaptive subgradient methods for online learning and stochastic optimization
- An efficient gradient method using the Yuan steplength
- Analysis of monotone gradient methods
- Balancing rates and variance via adaptive batch-size for stochastic optimization problems
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- Gradient methods with adaptive step-sizes
- Minimizing finite sums with the stochastic average gradient
- New adaptive stepsize selections in gradient methods
- Optimization methods for large-scale machine learning
- Sample size selection in optimization methods for machine learning
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
- Stochastic variance reduced gradient methods using a trust-region-like scheme
- The cyclic Barzilai–Borwein method for unconstrained optimization
- Two-Point Step Size Gradient Methods
- Understanding machine learning. From theory to algorithms
Cited in (7)
- On adaptive stochastic heavy ball momentum for solving linear systems
- Scale invariant stochastic gradient method with momentum
- A stochastic gradient method with variance control and variable learning rate for deep learning
- A semismooth Newton stochastic proximal point algorithm with variance reduction
- A mini-batch proximal stochastic recursive gradient algorithm with diagonal Barzilai-Borwein stepsize
- A stochastic variance reduced gradient using Barzilai-Borwein techniques as second order information
- Accelerating stochastic sequential quadratic programming for equality constrained optimization using predictive variance reduction
This page was built for publication: A stochastic variance reduced gradient method with adaptive step for stochastic optimization