On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
From MaRDI portal
Publication:3547627
DOI: 10.1109/TIT.2005.850052
zbMath: 1310.15076
OpenAlex: W2139095575
MaRDI QID: Q3547627
Christopher K. I. Williams, Nello Cristianini, John Shawe-Taylor, J. S. Kandola
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2005.850052
Factor analysis and principal components; correspondence analysis (62H25)
Learning and adaptive systems in artificial intelligence (68T05)
Eigenvalues, singular values, and eigenvectors (15A18)
Random matrices (algebraic aspects) (15B52)
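The paper concerns how the eigenvalues of the sample Gram (kernel) matrix, which kernel PCA decomposes, relate to the generalization error of projecting onto its leading eigenvectors. As a minimal illustration of the object under study, the sketch below computes the eigenspectrum of a double-centered Gram matrix for a Gaussian (RBF) kernel; the function names, the choice of kernel, and the bandwidth `gamma` are illustrative assumptions, not constructs from the paper itself.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then Gaussian (RBF) kernel values.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def gram_eigenspectrum(X, gamma=1.0):
    """Eigenvalues of the centered Gram matrix, sorted in descending order.

    Kernel PCA projects data onto the top eigenvectors of this matrix;
    results of the kind studied in the paper describe how these sample
    eigenvalues concentrate around those of the underlying integral operator.
    """
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = H @ K @ H                        # double-centered Gram matrix
    vals = np.linalg.eigvalsh(Kc)         # symmetric eigensolver
    return vals[::-1]

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
spectrum = gram_eigenspectrum(X)
```

Since the RBF kernel matrix is positive semidefinite and centering annihilates the all-ones direction, the computed spectrum is nonnegative (up to floating-point error) with at least one zero eigenvalue.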
Related Items (37)
Non-asymptotic properties of spectral decomposition of large Gram-type matrices and applications
Neural-network-based approach for extracting eigenvectors and eigenvalues of real normal matrices and some extension to real matrices
Accuracy of suboptimal solutions to kernel principal component analysis
Statistical properties of kernel principal component analysis
Unnamed Item
Concentration of kernel matrices with application to kernel spectral clustering
Two-sample test for equal distributions in separate metric space: New maximum mean discrepancy based approaches
Universally consistent vertex classification for latent positions graphs
Compressive statistical learning with random feature moments
On spectral windows in supervised learning from data
Another neural network based approach for computing eigenvalues and eigenvectors of real skew-symmetric matrices
Some remarks on MCMC estimation of spectra of integral operators
Explicit embeddings for nearest neighbor search with Mercer kernels
Learning noisy linear classifiers via adaptive and selective sampling
Statistical performance of support vector machines
Nonasymptotic upper bounds for the reconstruction error of PCA
Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
Transfer bounds for linear feature learning
Decomposing the tensor kernel support vector machine for neuroscience data with structured labels
The smallest eigenvalues of random kernel matrices: asymptotic results on the min kernel
Robust recovery of multiple subspaces by geometric \(l_{p}\) minimization
On the eigenvector bias of Fourier feature networks: from regression to solving multi-scale PDEs with physics-informed neural networks
Unsupervised slow subspace-learning from stationary processes
Robust dimension-free Gram operator estimates
Learning Theory
Statistical Analysis and Parameter Selection for Mapper
Dominated concentration
A spectral graph approach to discovering genetic ancestry
Basis operator network: a neural network-based model for learning nonlinear operators via neural basis
Sparse multiple kernel learning: minimax rates with random projection
Oracle inequalities for support vector machines that are based on random entropy numbers
Principal component analysis for multivariate extremes
Model reduction and neural networks for parametric PDEs
High-probability bounds for the reconstruction error of PCA
Random discretization of the finite Fourier transform and related kernel random matrices
Statistical analysis of Mapper for stochastic and multivariate filters
Approximate kernel PCA: computational versus statistical trade-off