On the Relation of Slow Feature Analysis and Laplacian Eigenmaps
Publication: 2887008
DOI: 10.1162/NECO_a_00214
zbMath: 1237.68163
OpenAlex: W1981899483
Wikidata: Q37932555
Scholia: Q37932555
MaRDI QID: Q2887008
Publication date: 15 May 2012
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00214
MSC classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Related Items (6)
- Continual curiosity-driven skill acquisition from high-dimensional video inputs for humanoid robots
- A dive into spectral inference networks: improved algorithms for self-supervised learning of continuous spectral representations
- Graph-based predictable feature analysis
- Optimal Curiosity-Driven Modular Incremental Slow Feature Analysis
- Incremental Slow Feature Analysis: Adaptive Low-Complexity Slow Feature Updating from High-Dimensional Input Streams
- Improved graph-based SFA: information preservation complements the slowness principle
Uses Software
Cites Work
- Semi-supervised learning on Riemannian manifolds
- Non-linear independent component analysis with diffusion maps
- Towards a theoretical foundation for Laplacian-based manifold methods
- The first order asymptotics of the extreme eigenvectors of certain Hermitian Toeplitz matrices
- A Theoretical Basis for Emergent Pattern Discrimination in Neural Systems Through Slow Feature Extraction
- Slow Feature Analysis: Unsupervised Learning of Invariances
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Slow Feature Analysis: A Theoretical Analysis of Optimal Free Responses