An Expectation-Maximization Approach to Nonlinear Component Analysis
Cited in (17):
- Two-phase incremental kernel PCA for learning massive or online datasets
- A Constrained EM Algorithm for Principal Component Analysis
- Bayesian ensemble learning for nonlinear factor analysis
- Efficient Tracking of the Dominant Eigenspace of a Normalized Kernel Matrix
- Nonlinear feature extraction based on centroids and kernel functions
- Efficiently updating and tracking the dominant kernel principal components
- On a nonlinear extension of the principal fitted component model
- Orthogonal series density estimation and the kernel eigenvalue problem
- Kernel PCA for feature extraction and de-noising in nonlinear regression
- Independent Component Analysis and Blind Signal Separation
- Scientific article (no title available; zbMATH DE number 2147268)
- Bayesian Framework for Least-Squares Support Vector Machine Classifiers, Gaussian Processes, and Kernel Fisher Discriminant Analysis
- Scientific article (no title available; zbMATH DE number 5968904)
- ECA: High-Dimensional Elliptical Component Analysis in Non-Gaussian Distributions
- The matrix ridge approximation: algorithms and applications
- Scientific article (no title available; zbMATH DE number 2040623)
- Eigen-analysis of nonlinear PCA with polynomial kernels