On the predictive potential of kernel principal components (Q2283584)
| Language | Label | Description | Also known as |
| --- | --- | --- | --- |
| English | On the predictive potential of kernel principal components | scientific article | |
Statements
Publication date: 3 January 2020
The authors consider kernel principal component analysis (KPCA) as a means of unsupervised dimension reduction. A core feature of the method is that, if an operation depends only on inner products, a lower-dimensional nonlinear projection of the data can be extracted without dealing directly with the projection coefficients. The same idea appears in other settings, such as the support vector machine and supervised dimension reduction. In the context of KPCA on a sample of vector-valued predictors, the authors study the question: is regressing a response on the leading components more useful than regressing on the lower-ranking ones? They show that, for an arbitrary distribution of the predictor \(X\) and an arbitrary conditional distribution of \(Y|X\), any measurable function \(g(Y)\), subject to a mild condition, tends to be more correlated with the higher-ranking kernel principal components than with the lower-ranking ones. The occurrence of this tendency in real-world data sets is also investigated.
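The tendency described above can be illustrated numerically. The following is a minimal sketch, not the authors' code: it simulates a nonlinear regression model, performs KPCA by eigendecomposing a double-centered Gaussian kernel matrix, and reports the squared correlation of the response with each of the leading components. The kernel bandwidth, sample size, and regression function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 300, 5
X = rng.standard_normal((n, p))
# Assumed nonlinear response (not from the paper), plus a little noise
Y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

# Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)), sigma = 1
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)

# Double-center K so the implicit feature map has mean zero
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H

# Eigendecompose; order components from largest to smallest eigenvalue
vals, vecs = np.linalg.eigh(Kc)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
scores = vecs * np.sqrt(np.clip(vals, 0.0, None))  # kernel PC scores

# Squared correlation of Y with each of the first 20 components
r2 = np.array([np.corrcoef(scores[:, j], Y)[0, 1] ** 2 for j in range(20)])
print(np.round(r2, 3))
```

In line with the paper's result, the largest squared correlations typically appear among the first few components, even though the components were extracted without any reference to \(Y\).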
Keywords: Cauchy distribution; dimension reduction; nonparametric regression; kernel principal components; unitary invariance