On the regularized Laplacian eigenmaps
From MaRDI portal
Publication: Q419255
Recommendations
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Consistency of regularized spectral clustering
- Learning Eigenfunctions Links Spectral Embedding and Kernel PCA
- On the relation of slow feature analysis and Laplacian eigenmaps
- Dependence of locally linear embedding on the regularization parameter
Cites work
- Advances in neural information processing systems 19. Proceedings of the 2006 conference, Vancouver, BC, Canada, December 4--6, 2006
- Capacity of reproducing kernel spaces in learning theory
- Consistency of regularized spectral clustering
- Consistency of spectral clustering
- Diffusion maps
- Graph-Based Semi-Supervised Learning and Spectral Kernel Design
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Learning Theory
- Learning Theory
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Manifold regularization: a geometric framework for learning from labeled and unlabeled examples
- On the effectiveness of Laplacian normalization for graph semi-supervised learning
- The covering number in learning theory
- Theory of Reproducing Kernels
- Towards a theoretical foundation for Laplacian-based manifold methods
- Weighted locally linear embedding for dimension reduction
Cited in (5)
- On the relation of slow feature analysis and Laplacian eigenmaps
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Geodesic distance-based generalized Gaussian Laplacian eigenmap
- Learning Eigenfunctions Links Spectral Embedding and Kernel PCA
- Regularized principal component analysis