Low-distortion subspace embeddings in input-sparsity time and applications to robust linear regression
From MaRDI portal
Publication:5495779
Abstract: Low-distortion embeddings are critical building blocks for developing random sampling and random projection algorithms for linear algebra problems. We show that, given a matrix \(A \in \mathbb{R}^{n \times d}\) with \(n \gg d\) and a \(p \in [1, 2)\), with a constant probability, we can construct a low-distortion embedding matrix \(\Pi \in \mathbb{R}^{O(\mathrm{poly}(d)) \times n}\) that embeds \(\mathcal{A}_p\), the \(\ell_p\) subspace spanned by \(A\)'s columns, into \((\mathbb{R}^{O(\mathrm{poly}(d))}, \|\cdot\|_p)\); the distortion of our embeddings is only \(O(\mathrm{poly}(d))\), and we can compute \(\Pi A\) in \(O(\mathrm{nnz}(A))\) time, i.e., input-sparsity time. Our result generalizes the input-sparsity time \(\ell_2\) subspace embedding by Clarkson and Woodruff [STOC'13]; and for completeness, we present a simpler and improved analysis of their construction for \(\ell_2\). These input-sparsity time \(\ell_p\) embeddings are optimal, up to constants, in terms of their running time; and the improved running time propagates to applications such as \((1 \pm \varepsilon)\)-distortion subspace embedding and relative-error \(\ell_p\) regression. For \(\ell_2\), we show that a \((1+\varepsilon)\)-approximate solution to the \(\ell_2\) regression problem specified by the matrix \(A\) and a vector \(b \in \mathbb{R}^n\) can be computed in \(O(\mathrm{nnz}(A) + d^3 \log(d/\varepsilon)/\varepsilon^2)\) time; and for \(\ell_p\), via a subspace-preserving sampling procedure, we show that a \((1 \pm \varepsilon)\)-distortion embedding of \(\mathcal{A}_p\) into \(\mathbb{R}^{O(\mathrm{poly}(d))}\) can be computed in \(O(\mathrm{nnz}(A) \cdot \log n)\) time, and we also show that a \((1+\varepsilon)\)-approximate solution to the \(\ell_p\) regression problem \(\min_{x \in \mathbb{R}^d} \|Ax - b\|_p\) can be computed in \(O(\mathrm{nnz}(A) \cdot \log n + \mathrm{poly}(d) \log(1/\varepsilon)/\varepsilon^2)\) time. Moreover, we can further reduce the embedding dimension, or equivalently the sample size, without increasing the overall complexity.
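To make the input-sparsity-time idea concrete, here is a minimal NumPy sketch (not the paper's exact construction) of a CountSketch-style embedding in the spirit of Clarkson and Woodruff's \(\ell_2\) result: each row of \(A\) is hashed to one of \(m\) output rows with a random sign, so \(\Pi A\) costs \(O(\mathrm{nnz}(A))\) time, and the sketched least-squares problem approximates the original one. The embedding dimension `m` and the test matrices are illustrative choices, not values from the paper.

```python
import numpy as np

def countsketch(A, m, rng):
    """Apply a CountSketch-style sparse embedding Pi to A.

    Each input row is hashed to one of m output rows and multiplied
    by a random sign, so computing Pi @ A touches each nonzero of A
    exactly once, i.e., input-sparsity time.
    """
    n = A.shape[0]
    rows = rng.integers(0, m, size=n)        # hash bucket per input row
    signs = rng.choice([-1.0, 1.0], size=n)  # random sign per input row
    SA = np.zeros((m, A.shape[1]))
    np.add.at(SA, rows, signs[:, None] * A)  # accumulate signed rows
    return SA

rng = np.random.default_rng(0)
n, d = 10000, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Sketch [A | b] once, then solve the much smaller regression problem.
m = 50 * d * d  # embedding dimension: poly(d), independent of n
SAb = countsketch(np.hstack([A, b[:, None]]), m, rng)
x_sketch, *_ = np.linalg.lstsq(SAb[:, :d], SAb[:, d], rcond=None)

# Compare residuals against the exact least-squares solution.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
ratio = np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b)
print(ratio)  # close to 1: the sketched solution is near-optimal
```

Because the subspace embedding preserves \(\|Ax - b\|_2\) up to \((1 \pm \varepsilon)\) for all \(x\), the minimizer of the sketched problem has residual within \(1 + O(\varepsilon)\) of optimal, while the solve itself runs on an \(m \times d\) system whose size is independent of \(n\).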
Recommendations
- A framework for robust subspace learning
- Subspace learning by \(\ell^0\)-induced sparsity
- Linear embedding by joint robust discriminant analysis and inter-class sparsity
- Sublinear Cost Low Rank Approximation via Subspace Sampling
- Fast, robust and non-convex subspace recovery
- Robust non-parametric regression via incoherent subspace projections
- Robust Subspace Discovery via Relaxed Rank Minimization
- Input sparsity time low-rank approximation via ridge leverage score sampling
Cited in (42)
- Approximate Newton methods
- Quantile regression for large-scale data via sparse exponential transform method
- Sublinear update time randomized algorithms for dynamic graph regression
- Sparser Johnson-Lindenstrauss transforms
- An efficient algorithm for computing the approximate t-URV and its applications
- Estimating Leverage Scores via Rank Revealing Methods and Randomization
- Fast and accurate randomized algorithms for linear systems and eigenvalue problems
- Randomized Quaternion Singular Value Decomposition for Low-Rank Matrix Approximation
- ISLET: fast and optimal low-rank tensor regression via importance sketching
- Single pass spectral sparsification in dynamic streams
- Structural Convergence Results for Approximation of Dominant Subspaces from Block Krylov Spaces
- Erasure coding for fault-oblivious linear system solvers
- Randomized numerical linear algebra: Foundations and algorithms
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Weighted SGD for \(\ell_p\) regression with randomized preconditioning
- Nearly tight oblivious subspace embeddings by trace inequalities
- Sharper bounds for regularized data fitting
- Scalable kernel \(k\)-means clustering with Nyström approximation: relative-error bounds
- Optimal subsampling for large‐sample quantile regression with massive data
- Spectral estimation from simulations via sketching
- \texttt{pylspack}: parallel algorithms and data structures for sketching, column subset selection, regression, and leverage scores
- Simpler is better: a comparative study of randomized pivoting algorithms for CUR and interpolative decompositions
- Tight Bounds for \(\ell_1\) Oblivious Subspace Embeddings
- On approximating matrix norms in data streams
- Online row sampling
- Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
- Practical sketching algorithms for low-rank matrix approximation
- Tight bounds for \(\ell_p\) oblivious subspace embeddings
- On randomized sketching algorithms and the Tracy-Widom law
- Sampling Algorithms and Coresets for \(\ell_p\) Regression
- Structured random sketching for PDE inverse problems
- Efficient bounds and estimates for canonical angles in randomized subspace approximations
- Sampling Lasso quantile regression for large-scale data
- New subset selection algorithms for low rank approximation: offline and online
- A User-Friendly Computational Framework for Robust Structured Regression with the L2 Criterion
- scientific article; zbMATH DE number 7625175
- Principled interpolation of Green's functions learned from data
- Streaming low-rank matrix approximation with an application to scientific simulation
- Tensor-structured sketching for constrained least squares
- Approximate sparse linear regression
- A count sketch maximal weighted residual Kaczmarz method for solving highly overdetermined linear systems
- Randomized algorithms in numerical linear algebra