Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
DOI: 10.1007/s10589-020-00220-z
zbMath: 1466.90065
arXiv: 1712.09677
OpenAlex: W3088745370
MaRDI QID: Q2023684
Nicolas Loizou, Peter Richtárik
Publication date: 3 May 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1712.09677
Keywords: convex optimization, linear systems, quadratic optimization, stochastic methods, stochastic gradient descent, randomized Kaczmarz, randomized coordinate descent, heavy ball momentum, stochastic Newton
MSC classification: Analysis of algorithms and problem complexity (68Q25); Analysis of algorithms (68W40); Convex programming (90C25); Quadratic programming (90C20); Stochastic programming (90C15); Iterative numerical methods for linear systems (65F10); Random matrices (algebraic aspects) (15B52); Complexity and performance of numerical algorithms (65Y20); Randomized algorithms (68W20); Linear equations (linear algebraic aspects) (15A06)
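As orientation for the record above: the paper studies heavy ball momentum variants of stochastic iterative methods, including the randomized Kaczmarz method for a consistent linear system Ax = b, with the update x_{k+1} = x_k − ω ∇f_i(x_k) + β (x_k − x_{k−1}). The sketch below is illustrative only, not the authors' code; the step size omega, momentum beta, and uniform row sampling are assumptions chosen for readability.

```python
import numpy as np

def stochastic_heavy_ball(A, b, omega=0.5, beta=0.3, iters=2000, seed=0):
    """SGD with heavy ball momentum in its randomized-Kaczmarz
    instantiation for a consistent system Ax = b:
        x_{k+1} = x_k - omega * grad_i(x_k) + beta * (x_k - x_prev)
    where grad_i is the Kaczmarz-style stochastic gradient built
    from a single randomly sampled row of A.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_prev = np.zeros(d)
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)                              # sample one equation uniformly
        a_i = A[i]
        grad = ((a_i @ x - b[i]) / (a_i @ a_i)) * a_i    # projection residual along row i
        # evaluate the RHS with the old iterates, then shift (x_prev <- x)
        x, x_prev = x - omega * grad + beta * (x - x_prev), x
    return x

# Usage on a consistent random linear system
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 10))
x_star = rng.standard_normal(10)
b = A @ x_star
x_hat = stochastic_heavy_ball(A, b)
print(np.linalg.norm(x_hat - x_star))   # should be small after enough iterations
```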
Cites Work
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Parallel coordinate descent methods for big data optimization
- Randomized block Kaczmarz method with projection for solving least squares
- Minimizing finite sums with the stochastic average gradient
- Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma
- Old and new results on algebraic connectivity of graphs
- iPiasco: inertial proximal algorithm for strongly convex optimization
- Randomized Kaczmarz solver for noisy linear systems
- A randomized Kaczmarz algorithm with exponential convergence
- A limit theorem for the norm of random matrices
- Stochastic heavy ball
- Coordinate descent algorithms
- Convergence rates for Kaczmarz-type algorithms
- Paved with good intentions: analysis of a randomized block Kaczmarz method
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Randomized Extended Kaczmarz for Solving Least Squares
- iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Accelerated, Parallel, and Proximal Coordinate Descent
- An accelerated randomized Kaczmarz algorithm
- Convergence Properties of the Randomized Extended Gauss–Seidel and Kaczmarz Methods
- Randomized Iterative Methods for Linear Systems
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule
- Random Geometric Graphs
- Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Least-squares solution of overdetermined inconsistent linear systems using Kaczmarz's relaxation
- Katyusha: the first direct acceleration of stochastic gradient methods
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- A Convergent Incremental Gradient Method with a Constant Step Size
- Some methods of speeding up the convergence of iteration methods
- Linear Recursive Sequences
- A Stochastic Approximation Method
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
- Heavy-ball method in nonconvex optimization problems