On the relation of slow feature analysis and Laplacian eigenmaps
From MaRDI portal
Publication:2887008
Recommendations
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Learning Eigenfunctions Links Spectral Embedding and Kernel PCA
- On the regularized Laplacian eigenmaps
- Slow Feature Analysis: Unsupervised Learning of Invariances
- What Is the Relation Between Slow Feature Analysis and Independent Component Analysis?
Cites work
- A theoretical basis for emergent pattern discrimination in neural systems through slow feature extraction
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Non-linear independent component analysis with diffusion maps
- Semi-supervised learning on Riemannian manifolds
- Slow Feature Analysis: A Theoretical Analysis of Optimal Free Responses
- Slow Feature Analysis: Unsupervised Learning of Invariances
- The first order asymptotics of the extreme eigenvectors of certain Hermitian Toeplitz matrices
- Towards a theoretical foundation for Laplacian-based manifold methods
Cited in (14)
- What Is the Relation Between Slow Feature Analysis and Independent Component Analysis?
- How to solve classification and regression problems on high-dimensional data with a supervised extension of slow feature analysis
- Effective dimensionality reduction for visualizing neural dynamics by Laplacian eigenmaps
- Nonlinear dimensionality reduction using a temporal coherence principle
- Graph-based predictable feature analysis
- Improved graph-based SFA: information preservation complements the slowness principle
- On the regularized Laplacian eigenmaps
- Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis
- Optimal Curiosity-Driven Modular Incremental Slow Feature Analysis
- Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs
- A Maximum-Likelihood Interpretation for Slow Feature Analysis
- Incremental slow feature analysis: adaptive low-complexity slow feature updating from high-dimensional input streams
- A dive into spectral inference networks: improved algorithms for self-supervised learning of continuous spectral representations
- Continual curiosity-driven skill acquisition from high-dimensional video inputs for humanoid robots
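The relation named in the title can be sketched numerically: for linear SFA on a whitened time series, the derivative covariance that SFA minimizes equals the quadratic form of the path-graph Laplacian that connects consecutive samples. A minimal sketch, assuming a toy random-walk signal (all variable names are illustrative, not from the publication):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 200, 3
X = np.cumsum(rng.standard_normal((T, n)), axis=0)  # toy multivariate time series

# Whiten the signal (SFA's zero-mean, unit-covariance constraint)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt.T @ np.diag(1.0 / s) * np.sqrt(T)

# Linear SFA objective: covariance of the time derivative
dZ = np.diff(Z, axis=0)
A = dZ.T @ dZ

# Path-graph Laplacian linking each sample t to sample t+1
W = np.zeros((T, T))
idx = np.arange(T - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0
L = np.diag(W.sum(axis=1)) - W

# The two quadratic forms coincide: minimizing slowness is a
# Laplacian-eigenmap problem on the temporal chain graph
assert np.allclose(A, Z.T @ L @ Z)
```

The assertion holds because z'Lz on a path graph expands to the sum of squared differences between consecutive samples, which is exactly the discrete-derivative energy SFA minimizes.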