Low Rank Approximation at Sublinear Cost
From MaRDI portal
Publication:6320286
arXiv: 1906.04327
MaRDI QID: Q6320286
FDO: Q6320286
Authors: Victor Y. Pan, Qi Luan, John Svadlenka, Liang Zhao
Publication date: 10 June 2019
Abstract: Low Rank Approximation (LRA) of an m-by-n matrix is a hot research subject, fundamental for Matrix and Tensor Computations and Big Data Mining and Analysis. Computations with LRA can be performed at sublinear cost -- by using far fewer than mn memory cells and arithmetic operations -- but can we compute LRA itself at sublinear cost? Yes and no. No, because the spectral, Frobenius, and all other norms of the error matrix of the LRA output by any sublinear cost deterministic or randomized algorithm exceed their minimal values for LRA by infinitely large factors for the worst case input, and even for inputs from the small families of our Appendix. Yes, because for about two decades Cross-Approximation (C-A) iterations, running at sublinear cost, have been consistently computing close LRA worldwide. We provide new insight into the coexistence of that "yes" and "no" by identifying C-A iterations as recursive sketching algorithms for LRA that use sampling test matrices and run at sublinear cost. As we prove, in good accordance with our numerical tests, already at a single recursive step they compute close LRA, except for a narrow class of hard inputs, which tends to shrink in the recursive process. We also discuss enhancing the power of sketching by means of using leverage scores.
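The sublinear-cost idea behind a single C-A/sketching step can be illustrated with a minimal CUR-style sketch: read only a few sampled rows and columns of A (not all mn entries) and combine them through the pseudo-inverse of their intersection. This is a hypothetical toy sketch, not the authors' algorithm; the helper name `cross_approx_step` and the uniform sampling choice are assumptions for illustration, and for an exactly rank-k input with generic factors such a step recovers A closely.

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_approx_step(A, k, oversample=2, rng=rng):
    """One CUR-style cross-approximation step (toy illustration).

    Samples l = k*oversample random row indices I and column indices J,
    then approximates A ~ C @ U @ R with C = A[:, J], R = A[I, :], and
    U = pinv(A[I, J]).  Only the sampled rows and columns of A are read,
    i.e. O((m + n) * l) entries -- sublinear in m*n for small k.
    """
    m, n = A.shape
    l = min(k * oversample, m, n)
    I = rng.choice(m, size=l, replace=False)   # sampled row indices
    J = rng.choice(n, size=l, replace=False)   # sampled column indices
    C = A[:, J]                                # m-by-l column sketch
    R = A[I, :]                                # l-by-n row sketch
    U = np.linalg.pinv(A[np.ix_(I, J)])        # l-by-l core, pseudo-inverted
    return C, U, R

# Demo on an exactly rank-k matrix with random Gaussian factors:
# a single step almost surely reproduces A up to rounding error.
m, n, k = 300, 200, 5
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
C, U, R = cross_approx_step(A, k)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```

As the abstract notes, uniform sampling like this fails on a narrow class of hard inputs (e.g. matrices whose mass hides in unsampled entries); leverage-score sampling and recursion are ways to shrink that class.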