Multiple kernel spectral regression for dimensionality reduction (Q1789907)
From MaRDI portal
scientific article; zbMATH DE number 6950669
| Language | Label | Description | Also known as |
|---|---|---|---|
| default for all languages | No label defined | | |
| English | Multiple kernel spectral regression for dimensionality reduction | scientific article; zbMATH DE number 6950669 | |
Statements
Multiple kernel spectral regression for dimensionality reduction (English)
0 references
10 October 2018
0 references
Summary: Traditional manifold learning algorithms, such as locally linear embedding, Isomap, and Laplacian eigenmap, only provide embedding results for the training samples. To address the out-of-sample extension problem, spectral regression (SR) learns an embedding function within a regression framework, which avoids eigen-decomposition of dense matrices. Motivated by the effectiveness of SR, we incorporate multiple kernel learning (MKL) into SR for dimensionality reduction. The proposed approach (termed MKL-SR) seeks an embedding function in the Reproducing Kernel Hilbert Space (RKHS) induced by multiple base kernels, further improving the performance of kernel-based SR (KSR). Moreover, the proposed MKL-SR algorithm can be applied in supervised, unsupervised, and semi-supervised settings. Experimental results on supervised and semi-supervised classification demonstrate the effectiveness and efficiency of our algorithm.
0 references
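The summary describes combining spectral regression with a kernel built from several base kernels: graph-Laplacian eigenvectors serve as regression targets, and an embedding function is fit in the RKHS of the combined kernel, which also handles out-of-sample points. The sketch below illustrates this idea only; it is not the authors' MKL-SR algorithm. In particular, the kernel weights `mu`, the RBF bandwidths `gammas`, and the regularizer `reg` are assumed hand-set values, whereas the paper learns the kernel combination via MKL.

```python
# Minimal sketch of spectral regression with a fixed convex combination of base
# kernels (illustrative only; not the paper's MKL-SR, which learns the weights).
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, Y, gammas, mu):
    """Convex combination of RBF base kernels with weights mu (assumed fixed here)."""
    return sum(m * rbf_kernel(X, Y, g) for m, g in zip(mu, gammas))

def spectral_responses(X, n_components, n_neighbors=5):
    """Regression targets from eigenvectors of a k-NN graph Laplacian
    (Laplacian-eigenmap style), as used by spectral regression."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]   # skip self-distance
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i]] = 1.0
    W = np.maximum(W, W.T)                                # symmetrize adjacency
    L = np.diag(W.sum(1)) - W                             # unnormalized Laplacian
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]                    # drop constant eigenvector

def fit_sketch(X, n_components=2, gammas=(0.1, 1.0, 10.0), mu=None, reg=1e-3):
    """Solve (K + reg*I) A = Y for the embedding-function coefficients A."""
    mu = np.ones(len(gammas)) / len(gammas) if mu is None else np.asarray(mu)
    Y = spectral_responses(X, n_components)
    K = combined_kernel(X, X, gammas, mu)
    A = np.linalg.solve(K + reg * np.eye(len(X)), Y)
    return A, mu, gammas

def transform(X_train, X_new, A, mu, gammas):
    """Out-of-sample extension: evaluate the learned function on new points."""
    return combined_kernel(X_new, X_train, gammas, mu) @ A

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 5))
    A, mu, gammas = fit_sketch(X)
    print(transform(X, rng.normal(size=(3, 5)), A, mu, gammas).shape)  # (3, 2)
```

Because the embedding is expressed through kernel evaluations against the training set, new samples are mapped by a single matrix product, which is the out-of-sample property the summary emphasizes.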