On the predictive potential of kernel principal components (Q2283584)

From MaRDI portal

MaRDI profile type: MaRDI publication profile

Cites work:
    Theory of Reproducing Kernels
    Sufficient dimension reduction via principal L\(q\) support vector machine
    Q3643286
    Predictive power of principal components for single-index model and sufficient dimension reduction
    Dimension reduction strategies for analyzing global gene expression data with a response
    Rejoinder: Fisher lecture: Dimension reduction in regression
    Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
    Q3996150
    Q3093182
    Kernel dimension reduction in regression
    Ordering and Selecting Components in Multivariate or Functional Data Linear Prediction
    Principal Curves
    A Biometrics Invited Paper. The Analysis and Selection of Variables in Linear Regression
    Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators
    An RKHS formulation of the inverse regression dimension-reduction problem
    Q5452361
    Principal component analysis.
    On principal components regression with Hilbertian predictors
    Completely random measures
    A general theory for nonlinear sufficient dimension reduction: formulation and estimation
    Comment: Fisher lecture: Dimension reduction in regression
    Principal support vector machines for linear and nonlinear sufficient dimension reduction
    Nonlinear sufficient dimension reduction for functional data
    Q3219581
    Principal component regression revisited
    Q4272617
    Q4863755
    Data spectroscopy: eigenspaces of convolution operators and clustering
    Smoothed functional principal components analysis by choice of norm
    Q4172685
    Q3330236
    Q4261789

OpenAlex ID: W2996823939


Language: English
Label: On the predictive potential of kernel principal components
Description: scientific article

    Statements

    On the predictive potential of kernel principal components (English)
    3 January 2020
    The authors consider kernel principal component analysis (KPCA) as a means of unsupervised dimension reduction. A core feature of this method is that, if an operation depends only on inner products, then a lower-dimensional nonlinear projection of the data can be extracted without dealing directly with the projection coefficients. The same idea appears in other settings, such as the support vector machine and supervised dimension reduction. In the context of KPCA applied to a sample of vector-valued predictors, the authors study the question: is regressing a response on the leading components more useful than regressing on the lower-ranking components? They show that, if an arbitrary distribution for the predictor \(X\) and an arbitrary conditional distribution for \(Y|X\) are chosen, then any measurable function \(g(Y)\), subject to a mild condition, tends to be more correlated with the higher-ranking kernel principal components than with the lower-ranking ones. The occurrence of this tendency in real-world data sets is also investigated.
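    A minimal numerical sketch of this phenomenon is given below. It is not the authors' implementation: the simulated data, the Gaussian kernel with a median-heuristic bandwidth, and the choice \(g(Y)=Y\) are illustrative assumptions. Kernel principal component scores are extracted from the double-centered Gram matrix, and the squared sample correlation between \(g(Y)\) and each score is compared across component ranks.

```python
# Sketch only (assumed setup: simulated data, Gaussian kernel, median-heuristic
# bandwidth, g(Y) = Y); not the code or exact setting used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n, p = 300, 5
X = rng.standard_normal((n, p))                              # predictors
Y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

# Gaussian kernel Gram matrix: depends on X only through pairwise distances.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / (2.0 * np.median(sq_dists)))

# Double-center the Gram matrix and eigendecompose it (kernel PCA).
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H
eigvals, eigvecs = np.linalg.eigh(Kc)
order = np.argsort(eigvals)[::-1]                            # descending eigenvalues
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Sample kernel principal component scores (eigenvectors scaled by sqrt eigenvalues).
scores = eigvecs * np.sqrt(np.clip(eigvals, 0.0, None))

# Squared sample correlation between g(Y) = Y and each of the first 20 components.
corr2 = np.array([np.corrcoef(Y, scores[:, j])[0, 1] ** 2 for j in range(20)])
print(np.round(corr2, 3))  # in line with the paper's tendency, the correlation
                           # tends to concentrate in the leading ranks
```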
    Cauchy distribution
    dimension reduction
    nonparametric regression
    kernel principal components
    unitary invariance