Convergence Analysis of Inexact Randomized Iterative Methods
DOI: 10.1137/19M125248X
zbMath: 1505.68046
arXiv: 1903.07971
OpenAlex: W3112635593
MaRDI QID: Q5856678
Peter Richtárik, Nicolas Loizou
Publication date: 29 March 2021
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/1903.07971
Keywords: convex optimization; linear systems; quadratic optimization; stochastic gradient descent; iteration complexity; inexact methods; randomized block coordinate descent; randomized block Kaczmarz; stochastic Newton method
MSC classification: Analysis of algorithms (68W40); Convex programming (90C25); Quadratic programming (90C20); Stochastic programming (90C15); Iterative numerical methods for linear systems (65F10); Random matrices (algebraic aspects) (15B52); Complexity and performance of numerical algorithms (65Y20); Randomized algorithms (68W20); Linear equations (linear algebraic aspects) (15A06)
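For orientation, here is a minimal sketch of the kind of method the keywords refer to: a randomized Kaczmarz iteration for a consistent linear system A x = b in which each projection step is perturbed to mimic an inexactly computed update. This is an illustrative toy, not the paper's exact algorithm or error model; the function name, the fixed perturbation level eps, and the Gaussian form of the inexactness are assumptions made only for this example.

```python
# A minimal sketch (assumptions noted above): randomized Kaczmarz for A x = b,
# with an artificial perturbation standing in for an inexactly computed update.
import numpy as np

def inexact_randomized_kaczmarz(A, b, iters=2000, eps=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms_sq = np.sum(A * A, axis=1)
    probs = row_norms_sq / row_norms_sq.sum()   # sample rows proportional to ||a_i||^2
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        residual = A[i] @ x - b[i]
        step = (residual / row_norms_sq[i]) * A[i]          # exact Kaczmarz projection step
        error = eps * rng.standard_normal(n) / np.sqrt(n)   # inexactness added to the update
        x = x - (step + error)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true
    x = inexact_randomized_kaczmarz(A, b)
    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

With an exact step (eps = 0) the iterates converge to a solution; with a bounded perturbation they converge only to a neighborhood whose size depends on the inexactness, which is the qualitative behavior the paper's convergence analysis quantifies.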
Cites Work
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- First-order methods of smooth convex optimization with inexact oracle
- Randomized block Kaczmarz method with projection for solving least squares
- Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma
- Randomized Kaczmarz solver for noisy linear systems
- A flexible coordinate descent method
- Linear convergence of the randomized sparse Kaczmarz method
- Newton-type methods for non-convex optimization under inexact Hessian information
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Convergence rates for Kaczmarz-type algorithms
- Paved with good intentions: analysis of a randomized block Kaczmarz method
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A unified framework for some inexact proximal point algorithms
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Randomized Extended Kaczmarz for Solving Least Squares
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Rate Analysis of Inexact Dual First-Order Methods: Application to Dual Decomposition
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Accelerated, Parallel, and Proximal Coordinate Descent
- Convergence Properties of the Randomized Extended Gauss-Seidel and Kaczmarz Methods
- Randomized Iterative Methods for Linear Systems
- Termination criteria for inexact fixed‐point schemes
- Inexact Newton Methods
- Numerical Optimization
- Non-asymptotic convergence analysis of inexact gradient methods for machine learning without strong convexity
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Least-squares solution of overdetermined inconsistent linear systems using Kaczmarz's relaxation
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
- An investigation of Newton-Sketch and subsampled Newton methods
- Order-Optimal Consensus Through Randomized Path Averaging