Learning in high-dimensional feature spaces using ANOVA-based fast matrix-vector multiplication
From MaRDI portal
Abstract: Kernel matrices are crucial in many learning tasks such as support vector machines or kernel ridge regression. The kernel matrix is typically dense and large-scale. Depending on the dimension of the feature space, even computing all of its entries in reasonable time becomes a challenging task. For such dense matrices the cost of a matrix-vector product scales quadratically with the matrix dimension N, if no customized methods are applied. We propose the use of an ANOVA kernel, where we construct several kernels based on lower-dimensional feature spaces for which we provide fast algorithms realizing the matrix-vector products. We employ the non-equispaced fast Fourier transform (NFFT), which is of linear complexity for fixed accuracy. Based on a feature grouping approach, we then show how the fast matrix-vector products can be embedded into a learning method, choosing kernel ridge regression and the conjugate gradient solver. We illustrate the performance of our approach on several data sets.
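The idea sketched in the abstract can be illustrated in a few lines: build a kernel as a sum of Gaussian kernels over low-dimensional feature groups, and solve the kernel ridge regression system matrix-free with conjugate gradients. This is only a minimal sketch under assumed choices (the feature groups, kernel width, and ridge parameter are hypothetical); the per-group matrix-vector products are formed densely here for simplicity, whereas the paper replaces exactly this step with an NFFT-based fast summation.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Synthetic data (stand-in for a real data set).
rng = np.random.default_rng(0)
N, d = 200, 6
X = rng.standard_normal((N, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)

# Hypothetical feature grouping: three 2-dimensional groups.
groups = [(0, 1), (2, 3), (4, 5)]
sigma, lam = 1.0, 1e-2  # kernel width and ridge parameter (assumed)

def anova_matvec(v):
    """Apply (K + lam*I) to v, where K is a sum of Gaussian kernels,
    one per low-dimensional feature group. In the paper's method, each
    of these per-group products is evaluated fast via the NFFT instead
    of forming the dense kernel block."""
    out = lam * v
    for g in groups:
        Xg = X[:, list(g)]
        sq = ((Xg[:, None, :] - Xg[None, :, :]) ** 2).sum(-1)
        out += np.exp(-sq / (2 * sigma**2)) @ v
    return out

# Kernel ridge regression: solve (K + lam*I) alpha = y with CG,
# never materializing the full N x N matrix.
A = LinearOperator((N, N), matvec=anova_matvec, dtype=np.float64)
alpha, info = cg(A, y)
residual = np.linalg.norm(anova_matvec(alpha) - y)
```

Since each summand kernel is positive semidefinite and the ridge term makes the system matrix positive definite, CG is guaranteed to converge; the overall cost per iteration is dominated by the matrix-vector product, which is exactly the step the NFFT accelerates to linear complexity for fixed accuracy.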
Recommendations
- Approximation of high-dimensional kernel matrices by multilevel circulant matrices
- On the Nyström method for approximating a gram matrix for improved kernel-based learning
- Learning Theory
- Randomized Nyström features for fast regression: an error analysis
- Learning low-rank kernel matrices with column-based methods
Cites work
- scientific article; zbMATH DE number 1953444
- A note on fast Fourier transforms for nonequispaced grids
- ASKIT: approximate skeletonization kernel-independent treecode in high dimensions
- Advanced Lectures on Machine Learning
- Approximation of high-dimensional periodic functions with Fourier-based methods
- Fast Fourier Transforms for Nonequispaced Data
- Fast NFFT based summation of radial functions
- Fast Summation at Nonequispaced Knots by NFFT
- Fast convolution with radial kernels at nonequispaced knots
- Faster kernel ridge regression using sketching and preconditioning
- Kernel methods in machine learning
- Learning multivariate functions with low-dimensional structures using polynomial bases
- Methods of conjugate gradients for solving linear systems
- Multiple kernel learning algorithms
- On the fast Fourier transform of functions with singularities
- Pattern recognition and machine learning.
- Using NFFT 3 -- a software library for various nonequispaced fast Fourier transforms
Cited in (5)
- Fast kernel summation in high dimensions via slicing and Fourier transforms
- Unbalanced optimal transport and maximum mean discrepancies: interconnections and rapid evaluation
- Sparse additive function decompositions facing basis transforms
- ASKIT: an efficient, parallel library for high-dimensional kernel summations
- Approximation of high-dimensional kernel matrices by multilevel circulant matrices
This page was built for publication: Learning in high-dimensional feature spaces using ANOVA-based fast matrix-vector multiplication
(MaRDI item Q2087418)