Gaussian Sketching yields a J-L Lemma in RKHS
Publication: 6323745
arXiv: 1908.05818 · MaRDI QID: Q6323745 · FDO: Q6323745
Authors: Samory Kpotufe, Bharath K. Sriperumbudur
Publication date: 15 August 2019
Abstract: The main contribution of the paper is to show that Gaussian sketching of a kernel Gram matrix $\mathbf{K}$ yields an operator whose counterpart in an RKHS $\mathcal{H}$ is a random projection operator, in the spirit of the Johnson-Lindenstrauss (J-L) lemma. To be precise, given a random matrix $Z$ with i.i.d. Gaussian entries, we show that the sketch $Z\mathbf{K}$ corresponds to a particular random operator in the (infinite-dimensional) Hilbert space $\mathcal{H}$ that maps functions $f \in \mathcal{H}$ to a low-dimensional space $\mathbb{R}^d$, while preserving a weighted RKHS inner-product of the form $\langle f, g \rangle_{\Sigma}$, where $\Sigma$ is the covariance operator induced by the data distribution. In particular, under similar assumptions as in kernel PCA (KPCA) or kernel $k$-means, well-separated subsets of feature space remain well-separated after such an operation, which suggests similar benefits as in KPCA and/or kernel $k$-means, albeit at the much cheaper cost of a random projection. Moreover, our convergence rates suggest that, given a large dataset of size $N$, we can build the Gram matrix $\mathbf{K}$ on a much smaller subsample of size $n \ll N$, so that the sketch $Z\mathbf{K}$ is very cheap to obtain and subsequently apply as a projection operator on the original data. We verify these insights empirically on synthetic data and on real-world clustering applications.
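The pipeline described in the abstract can be illustrated numerically. The following Python/NumPy snippet is a minimal sketch, not the authors' code: the Gaussian kernel, the $1/\sqrt{d}$ scaling of $Z$, and the $1/n$ normalisation of the projection are assumptions made for the example, and only the overall structure (subsampled Gram matrix, Gaussian sketch, projection of the full dataset) follows the abstract.

```python
# Illustrative Gaussian sketching of a subsampled kernel Gram matrix.
# Assumptions (not from the paper): Gaussian kernel, 1/sqrt(d) and 1/n scalings.
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gram matrix with entries k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

rng = np.random.default_rng(0)
N, n, d = 5000, 200, 20          # full data size, subsample size n << N, sketch dimension d
X = rng.normal(size=(N, 5))      # synthetic data

# 1. Gram matrix K built on a small subsample (n x n), cheap compared to N x N.
idx = rng.choice(N, size=n, replace=False)
Xs = X[idx]
K = gaussian_kernel(Xs, Xs)

# 2. Gaussian sketch: Z has i.i.d. Gaussian entries, so ZK is a d x n matrix
#    (the sketched Gram matrix referred to in the abstract).
Z = rng.normal(size=(d, n)) / np.sqrt(d)   # scaling chosen for illustration
ZK = Z @ K

# 3. Apply the sketch as a projection operator to the original data: each point x
#    is represented by its cross-kernel vector (k(x, x_j))_{j<=n} and mapped to R^d.
K_full = gaussian_kernel(X, Xs)            # N x n cross-kernel matrix
X_proj = (K_full @ Z.T) / n                # N x d low-dimensional representation
print(X_proj.shape)                        # (5000, 20)
```

In this sketch, only the n x n Gram matrix and an N x n cross-kernel matrix are ever formed, which is the cost saving suggested by the abstract when n is much smaller than N.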