Low-distortion subspace embeddings in input-sparsity time and applications to robust linear regression
DOI: 10.1145/2488608.2488621 · zbMATH Open: 1293.68150 · arXiv: 1210.3135 · OpenAlex: W2134342155 · MaRDI QID: Q5495779 · FDO: Q5495779
Authors: Xiangrui Meng, Michael W. Mahoney
Publication date: 7 August 2014
Published in: Proceedings of the forty-fifth annual ACM symposium on Theory of Computing (STOC 2013)
Full work available at URL: https://arxiv.org/abs/1210.3135
Recommendations
- A framework for robust subspace learning
- Subspace learning by \(\ell^0\)-induced sparsity
- Linear embedding by joint robust discriminant analysis and inter-class sparsity
- Sublinear Cost Low Rank Approximation via Subspace Sampling
- Fast, robust and non-convex subspace recovery
- Robust non-parametric regression via incoherent subspace projections
- Robust Subspace Discovery via Relaxed Rank Minimization
- Input sparsity time low-rank approximation via ridge leverage score sampling
MSC classifications:
- Linear regression; mixed models (62J05)
- Analysis of algorithms and problem complexity (68Q25)
- Randomized algorithms (68W20)
- Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.) (68Q17)
Cited In (42)
- Sublinear update time randomized algorithms for dynamic graph regression
- Sparser Johnson-Lindenstrauss transforms
- Estimating Leverage Scores via Rank Revealing Methods and Randomization
- An efficient algorithm for computing the approximate t-URV and its applications
- Fast and accurate randomized algorithms for linear systems and eigenvalue problems
- Randomized Quaternion Singular Value Decomposition for Low-Rank Matrix Approximation
- ISLET: fast and optimal low-rank tensor regression via importance sketching
- Single pass spectral sparsification in dynamic streams
- Structural Convergence Results for Approximation of Dominant Subspaces from Block Krylov Spaces
- Randomized numerical linear algebra: Foundations and algorithms
- Erasure coding for fault-oblivious linear system solvers
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Weighted SGD for \(\ell_p\) regression with randomized preconditioning
- Nearly tight oblivious subspace embeddings by trace inequalities
- Sharper bounds for regularized data fitting
- Optimal subsampling for large‐sample quantile regression with massive data
- Scalable kernel \(k\)-means clustering with Nyström approximation: relative-error bounds
- Spectral estimation from simulations via sketching
- \texttt{pylspack}: parallel algorithms and data structures for sketching, column subset selection, regression, and leverage scores
- Simpler is better: a comparative study of randomized pivoting algorithms for CUR and interpolative decompositions
- Tight Bounds for \(\ell_1\) Oblivious Subspace Embeddings
- On approximating matrix norms in data streams
- Online row sampling
- Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
- Tight bounds for \(\ell_p\) oblivious subspace embeddings
- Practical sketching algorithms for low-rank matrix approximation
- On randomized sketching algorithms and the Tracy-Widom law
- Sampling Algorithms and Coresets for $\ell_p$ Regression
- Structured random sketching for PDE inverse problems
- Efficient bounds and estimates for canonical angles in randomized subspace approximations
- Sampling Lasso quantile regression for large-scale data
- New subset selection algorithms for low rank approximation: offline and online
- A User-Friendly Computational Framework for Robust Structured Regression with the L2 Criterion
- Principled interpolation of Green's functions learned from data
- Title not available
- Streaming low-rank matrix approximation with an application to scientific simulation
- Tensor-structured sketching for constrained least squares
- Approximate sparse linear regression
- Randomized algorithms in numerical linear algebra
- A count sketch maximal weighted residual Kaczmarz method for solving highly overdetermined linear systems
- Approximate Newton methods
- Quantile regression for large-scale data via sparse exponential transform method