RidgeSketch: a fast sketching based solver for large scale ridge regression
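The publication concerns sketch-and-project style solvers for ridge regression. As a rough illustration only (this is not the RidgeSketch package's API; the function name, parameters, and the choice of random-column-subset sketches are all invented for the example), a minimal NumPy sketch of sketch-and-project applied to the ridge normal equations (AᵀA + λI)w = Aᵀb might look like:

```python
import numpy as np

def ridge_sketch(A, b, lam=1.0, sketch_size=10, n_iters=500, seed=0):
    """Illustrative sketch-and-project solver for the ridge normal
    equations (A^T A + lam*I) w = A^T b, using random column subsets
    as sketches (i.e., randomized block coordinate descent)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    H = A.T @ A + lam * np.eye(d)  # SPD system matrix of the normal equations
    c = A.T @ b
    w = np.zeros(d)
    for _ in range(n_iters):
        # Sample a random subset of coordinates as the sketch S = I[:, idx]
        idx = rng.choice(d, size=min(sketch_size, d), replace=False)
        # Sketched residual S^T (H w - c)
        r = H[idx] @ w - c[idx]
        # Project: solve the small sketch_size x sketch_size system S^T H S
        delta = np.linalg.solve(H[np.ix_(idx, idx)], r)
        w[idx] -= delta
    return w
```

Because H is symmetric positive definite for λ > 0, each iteration only touches a small subsystem, and the iterates converge linearly in expectation to the exact ridge solution.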
From MaRDI portal
Publication:5099418
Recommendations
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Faster kernel ridge regression using sketching and preconditioning
- Fast regression with an \(\ell_{\infty}\) guarantee
- Sketching meets random projection in the dual: a provable recovery algorithm for big and high-dimensional data
- A statistical perspective on randomized sketching for ordinary least-squares
Cites work
- scientific article; zbMATH DE number 2086663 (no title available)
- scientific article; zbMATH DE number 3027894 (no title available)
- A fast randomized algorithm for the approximation of matrices
- A new theoretical estimate for the convergence rate of the maximal weighted residual Kaczmarz algorithm
- A randomized Kaczmarz algorithm with exponential convergence
- A sampling Kaczmarz-Motzkin algorithm for linear feasibility
- Accelerated sampling Kaczmarz Motzkin algorithm for the linear feasibility problem
- An accelerated randomized Kaczmarz algorithm
- An improved data stream summary: the count-min sketch and its applications
- Blendenpik: Supercharging LAPACK's Least-Squares Solver
- Convergence properties of the randomized extended Gauss-Seidel and Kaczmarz methods
- Coordinate descent algorithms
- Extensions of Lipschitz mappings into a Hilbert space
- Fast dimension reduction using Rademacher series on dual BCH codes
- Faster randomized block Kaczmarz algorithms
- Greed works: an improved analysis of sampling Kaczmarz-Motzkin
- Improved analysis of the subsampled randomized Hadamard transform
- Improved matrix algorithms via the subsampled randomized Hadamard transform
- Iterative Hessian sketch: fast and accurate solution approximation for constrained least-squares
- Kernel ridge regression
- LAPACK Users' Guide
- Methods of conjugate gradients for solving linear systems
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- On greedy randomized Kaczmarz method for solving large sparse linear systems
- Optimal bounds for Johnson-Lindenstrauss transforms and streaming problems with subconstant error
- Randomized Algorithms for Matrices and Data
- Randomized Sketches of Convex Programs With Sharp Guarantees
- Randomized iterative methods for linear systems
- Randomized methods for linear constraints: convergence rates and conditioning
- Scikit-learn: machine learning in Python
- Single projection Kaczmarz extended algorithms
- Sketched Newton-Raphson
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Some methods of speeding up the convergence of iteration methods
- Stochastic reformulations of linear systems: algorithms and convergence theory
- The fast Johnson-Lindenstrauss transform and approximate nearest neighbors
- Tikhonov Regularization and Total Least Squares
- Understanding machine learning. From theory to algorithms
- Unified Matrix Treatment of the Fast Walsh-Hadamard Transform
Cited in (6)
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Regularized and structured tensor total least squares methods with applications
- Sharp Analysis of Sketch-and-Project Methods via a Connection to Randomized Singular Value Decomposition
- RidgeSketch: a fast sketching based solver for large scale ridge regression
- RidgeSketch
- Faster kernel ridge regression using sketching and preconditioning
This page was built for publication: RidgeSketch: a fast sketching based solver for large scale ridge regression