Fast subspace approximation via greedy least-squares

From MaRDI portal
Abstract: In this note, we develop fast and deterministic dimensionality reduction techniques for a family of subspace approximation problems. Let $P \subset \mathbb{R}^N$ be a given set of $M$ points. The techniques developed herein find an $O(n \log M)$-dimensional subspace that is guaranteed to always contain a near-best fit $n$-dimensional hyperplane $\mathcal{H}$ for $P$ with respect to the cumulative projection error $\bigl( \sum_{\mathbf{x} \in P} \| \mathbf{x} - \Pi_{\mathcal{H}} \mathbf{x} \|_2^p \bigr)^{1/p}$, for any chosen $p > 2$. The deterministic algorithm runs in $\tilde{O}(MN^2)$-time, and can be randomized to run in only $\tilde{O}(MNn)$-time while maintaining its error guarantees with high probability. In the case $p = \infty$, the dimensionality reduction techniques can be combined with efficient algorithms for computing the John ellipsoid of a data set in order to produce an $n$-dimensional subspace whose maximum $\ell_2$-distance to any point in the convex hull of $P$ is minimized. The resulting algorithm remains $\tilde{O}(MNn)$-time. In addition, the dimensionality reduction techniques developed herein can also be combined with other existing subspace approximation algorithms for $2 < p \leq \infty$ (including more accurate algorithms based on convex programming relaxations) in order to reduce their runtimes.
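To make the greedy least-squares idea concrete, the following is a minimal NumPy sketch of one plausible residual-based greedy selection: repeatedly pick the point with the largest least-squares residual against the current subspace and enlarge the subspace by that residual direction. This is an illustrative sketch only, not the paper's exact algorithm; the function name `greedy_subspace`, the stopping tolerance, and the toy data are all assumptions introduced here.

```python
import numpy as np


def greedy_subspace(P, k, tol=1e-12):
    """Greedily build an orthonormal basis Q (N x k) from the rows of P (M x N).

    Sketch of a greedy least-squares strategy: at each step, find the
    worst-fit point (largest residual norm against the current subspace)
    and append its normalized residual as a new basis direction.
    Hypothetical illustration; not the authors' exact procedure.
    """
    M, N = P.shape
    Q = np.zeros((N, 0))   # orthonormal basis, one column per direction
    R = P.copy()           # current residual of every point
    for _ in range(k):
        norms = np.linalg.norm(R, axis=1)
        i = int(np.argmax(norms))      # index of the worst-fit point
        if norms[i] < tol:
            break                      # all points already captured
        q = R[i] / norms[i]            # new orthonormal direction
        Q = np.column_stack([Q, q])
        R = R - np.outer(R @ q, q)     # deflate residuals by new direction
    return Q


# Toy example: 100 points lying exactly on a 2-D subspace of R^5,
# so two greedy steps should capture them with (near-)zero residual.
rng = np.random.default_rng(0)
pts = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 5))
Q = greedy_subspace(pts, 2)
resid = pts - (pts @ Q) @ Q.T
```

Because each selected direction is a normalized residual, it is automatically orthogonal to all previously chosen directions, so no separate re-orthogonalization pass is needed in this sketch.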

