Think globally, fit locally under the manifold setup: asymptotic analysis of locally linear embedding
From MaRDI portal
Abstract: Since its introduction in 2000, locally linear embedding (LLE) has been widely applied in data science. We provide an asymptotic analysis of LLE under the manifold setup. We show that for a general manifold, asymptotically we may not obtain the Laplace-Beltrami operator, and the result may depend on the non-uniform sampling, unless a correct regularization is chosen. We also derive the corresponding kernel function, which indicates that LLE is not a Markov process. A comparison with other commonly applied nonlinear algorithms, particularly the diffusion map, is provided, and the relationship of LLE with locally linear regression is also discussed.
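The abstract refers to the role of regularization in the barycentric-weight step of LLE. As an illustration only (not the authors' analysis), here is a minimal NumPy sketch of the standard LLE algorithm; the function name `lle` and the parameter choices are our own, and the `reg` parameter corresponds to the regularization whose choice the paper shows to govern the asymptotic limit.

```python
import numpy as np

def lle(X, k=10, reg=1e-3, n_components=2):
    """Minimal locally linear embedding (LLE) sketch.

    X: (n, p) data matrix; k: number of nearest neighbors;
    reg: regularization of the local Gram matrix (the choice the
    abstract highlights); n_components: embedding dimension.
    """
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # k nearest neighbors of x_i, excluding x_i itself
        d = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(d)[1:k + 1]
        Z = X[idx] - X[i]                   # neighborhood centered at x_i
        G = Z @ Z.T                         # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)  # regularize before solving
        w = np.linalg.solve(G, np.ones(k))
        W[i, idx] = w / w.sum()             # barycentric weights sum to 1
    # embedding: bottom eigenvectors of (I - W)^T (I - W),
    # skipping the trivial constant eigenvector
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:n_components + 1]
```

Note that the weight matrix `W` is row-stochastic but not symmetric and may have negative entries, which is related to the paper's observation that LLE does not correspond to a Markov process.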
Cites work
- scientific article; zbMATH DE number 991833
- scientific article; zbMATH DE number 52737
- scientific article; zbMATH DE number 3367521
- A variational approach to the consistency of spectral clustering
- Chernoff's theorem and discrete time approximations of Brownian motion on manifolds
- Consistency of spectral clustering
- Consistency properties of nearest neighbor density function estimators
- Diffusion maps
- Embedding Riemannian manifolds by their heat kernel
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- Finite propagation speed, kernel estimates for functions of the Laplace operator, and the geometry of complete Riemannian manifolds
- From graph to manifold Laplacian: the convergence rate
- Graph connection Laplacian methods can be made robust to noise
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Learning Theory
- Learning Theory
- Local linear regression on manifolds and its geometric interpretation
- On information plus noise kernel random matrices
- Optimal shrinkage of eigenvalues in the spiked covariance model
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- Spectral convergence of the connection Laplacian from random samples
- Spectral geometry: direct and inverse problems. With an appendix by G. Besson
- The strong uniform consistency of nearest neighbor density estimates
- Two-dimensional tomography from noisy projections taken at unknown random directions
- Vector diffusion maps and the connection Laplacian
- Visualizing data using t-SNE
Cited in (11)
- Rates of the strong uniform consistency with rates for conditional \(U\)-statistics estimators with general kernels on manifolds
- Eigen-convergence of Gaussian kernelized graph Laplacian by manifold heat interpolation
- Diffusion maps for embedded manifolds with boundary with applications to PDEs
- Embeddings of Riemannian manifolds with finite eigenvector fields of connection Laplacian
- Local linear regression on manifolds and its geometric interpretation
- Learning low-dimensional nonlinear structures from high-dimensional noisy data: an integral operator approach
- Spectral convergence of graph Laplacian and heat kernel reconstruction in \(L^\infty\) from random samples
- Rates of the strong uniform consistency for the kernel-type regression function estimators with general kernels on manifolds
- Data-driven efficient solvers for Langevin dynamics on manifold in high dimensions
- Time-series forecasting using manifold learning, radial basis function interpolation, and geometric harmonics
- Connecting dots: from local covariance to empirical intrinsic geometry and locally linear embedding