Stochastic variance reduced gradient methods using a trust-region-like scheme
DOI: 10.1007/s10915-020-01402-x · zbMATH Open: 1461.90071 · OpenAlex: W3131341170 · MaRDI QID: Q1995995
Xinwei Liu, Jie Sun, Yuhong Dai, Tengteng Yu
Publication date: 2 March 2021
Published in: Journal of Scientific Computing
Full work available at URL: https://doi.org/10.1007/s10915-020-01402-x
Recommendations
- Analysis and improvement for a class of variance reduced methods
- A proximal stochastic gradient method with progressive variance reduction
- Improved SVRG for finite sum structure optimization with application to binary classification
- An analysis of stochastic variance reduced gradient for linear inverse problems
- Variable metric proximal stochastic variance reduced gradient methods for nonconvex nonsmooth optimization
Keywords: empirical risk minimization; trust region; Barzilai-Borwein stepsizes; mini-batches; stochastic variance reduced gradient
MSC classification: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Applications of mathematical programming (90C90); Nonlinear programming (90C30)
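For context on the keywords above, the following is a minimal sketch of a stochastic variance reduced gradient (SVRG) loop whose stepsize is chosen by the Barzilai-Borwein (BB) rule from successive snapshots, run on a toy least-squares problem. It illustrates the general SVRG-BB idea only; it is not the trust-region-like scheme of the paper, and the problem data, loop lengths, and safeguards are assumptions chosen for the example.

```python
import numpy as np

# Toy least-squares instance f(w) = (1/2n) ||A w - b||^2 (assumed data).
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
b = A @ w_true

def full_grad(w):
    # Full gradient (1/n) A^T (A w - b).
    return A.T @ (A @ w - b) / n

def comp_grad(w, i):
    # Gradient of the i-th component function.
    return A[i] * (A[i] @ w - b[i])

w_snap = np.zeros(d)
eta = 0.01          # initial stepsize before the BB rule kicks in
m = 2 * n           # inner-loop length (an assumed choice)
prev_snap, prev_mu = None, None

for epoch in range(30):
    mu = full_grad(w_snap)          # full gradient at the snapshot
    if prev_snap is not None:
        # BB stepsize from successive snapshots:
        # eta = ||s||^2 / (m |s^T y|), s = w_k - w_{k-1}, y = mu_k - mu_{k-1}.
        s = w_snap - prev_snap
        y = mu - prev_mu
        if abs(s @ y) > 1e-12:
            eta = (s @ s) / (m * abs(s @ y))
    prev_snap, prev_mu = w_snap.copy(), mu
    w = w_snap.copy()
    for _ in range(m):
        # Variance-reduced stochastic update.
        i = rng.integers(n)
        w -= eta * (comp_grad(w, i) - comp_grad(w_snap, i) + mu)
    w_snap = w

err = float(np.linalg.norm(w_snap - w_true))
print(err)
```

On this noiseless instance the iterates should approach the true minimizer `w_true`, with the BB rule adapting the stepsize each epoch.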
Cites Work
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- A Stochastic Approximation Method
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Two-Point Step Size Gradient Methods
- Understanding Machine Learning
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Online learning for matrix factorization and sparse coding
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Recent advances in trust region algorithms
- Stochastic Gradient Descent on Riemannian Manifolds
- A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems
- Large-Scale Machine Learning with Stochastic Gradient Descent
- A Barzilai-Borwein type method for stochastic linear complementarity problems
- Optimization Methods for Large-Scale Machine Learning
- Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- Quadratic regularization projected Barzilai-Borwein method for nonnegative matrix factorization
- Katyusha: the first direct acceleration of stochastic gradient methods
- On Projected Stochastic Gradient Descent Algorithm with Weighted Averaging for Least Squares Regression
Cited In (9)
- An analysis of stochastic variance reduced gradient for linear inverse problems
- A stochastic variance reduced gradient method with adaptive step for stochastic optimization
- A mini-batch proximal stochastic recursive gradient algorithm with diagonal Barzilai-Borwein stepsize
- Stochastic gradient algorithm with random truncations
- Variable metric proximal stochastic variance reduced gradient methods for nonconvex nonsmooth optimization
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
- Nonconvex optimization with inertial proximal stochastic variance reduction gradient
- Stochastic optimization using a trust-region method and random models
- Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport