Two-phase incremental kernel PCA for learning massive or online datasets
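For context on the topic named in the title: kernel PCA performs principal component analysis in a kernel-induced feature space by eigendecomposing the centered Gram matrix. The sketch below is a generic *batch* RBF-kernel PCA, not the paper's two-phase incremental algorithm (which updates the decomposition as new samples arrive); the function name, `gamma` parameter, and data are illustrative assumptions.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Batch kernel PCA with an RBF kernel (illustrative sketch only;
    the cited paper's two-phase incremental variant avoids recomputing
    this full eigendecomposition for each new sample)."""
    # Pairwise squared distances and the RBF Gram matrix K
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * d2)
    # Center the Gram matrix in feature space: K_c = K - 1K - K1 + 1K1
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose and keep the leading components
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Projections of the training points onto the principal components
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (20, 2)
```

An incremental scheme, as surveyed by the recommendations below (e.g. "Efficiently updating and tracking the dominant kernel principal components"), instead maintains a low-rank basis and folds each arriving sample into it.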
Recommendations
- Adaptive kernel principal component analysis
- Generalized KPCA by adaptive rules in feature space
- Efficiently updating and tracking the dominant kernel principal components
- Efficient KPCA-Based Feature Extraction: A Novel Algorithm and Experiments
- Perturbation scheme for online learning of features: Incremental principal component analysis
Cites work
- Scientific article (no title available); zbMATH DE number 6125590
- A simplified neuron model as a principal component analyzer
- AI 2005: Advances in Artificial Intelligence
- An Expectation-Maximization Approach to Nonlinear Component Analysis
- An algorithm for coneigenvalues and coneigenvectors of quaternion matrices
- Application of kernel principal component analysis to multi-characteristic parameter design problems
- Fast iterative kernel principal component analysis
- Feedforward neural networks for principal components extraction
- On incremental and robust subspace learning
- On the convergence of asynchronous parallel algorithm for large-scale linearly constrained minimization problem
- Principal component analysis
Cited in (4)
This page was built for publication: Two-phase incremental kernel PCA for learning massive or online datasets (MaRDI item Q2325168)