Learning low-dimensional nonlinear structures from high-dimensional noisy data: an integral operator approach
Publication: 6183757
DOI: 10.1214/23-aos2306
arXiv: 2203.00126
OpenAlex: W4387828537
MaRDI QID: Q6183757
Publication date: 4 January 2024
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2203.00126
Mathematics Subject Classification:
- Statistics on manifolds (62R30)
- Integral operators (47G10)
- Statistical aspects of big data and data science (62R07)
Cites Work
- On information plus noise kernel random matrices
- On Euclidean random matrices in high dimension
- The imbedding problem for Riemannian manifolds
- Geometry on probability spaces
- The spectrum of kernel random matrices
- Kernel methods in machine learning
- Data spectroscopy: eigenspaces of convolution operators and clustering
- The spectral norm of random inner-product kernel matrices
- On the distribution of the largest eigenvalue in principal components analysis
- Principal component analysis
- Spectral convergence of graph Laplacian and heat kernel reconstruction in \(L^\infty\) from random samples
- Think globally, fit locally under the manifold setup: asymptotic analysis of locally linear embedding
- Optimality of spectral clustering in the Gaussian mixture model
- An \({\ell_p}\) theory of PCA and spectral clustering
- Statistical inference for principal components of spiked covariance matrices
- Error estimates for spectral convergence of the graph Laplacian on random geometric graphs toward the Laplace-Beltrami operator
- Statistical properties of kernel principal component analysis
- Consistency of spectral clustering
- Diffusion maps
- From graph to manifold Laplacian: the convergence rate
- Modern multidimensional scaling: theory and applications
- Learning theory estimates via integral operators and their approximations
- Singular vector and singular subspace distribution for the matrix denoising model
- Concentration of kernel matrices with application to kernel spectral clustering
- A Riemann-Stein kernel method
- The spectrum of random kernel matrices: universality results for rough and varying kernels
- The spectrum of random inner-product kernel matrices
- Vector diffusion maps and the connection Laplacian
- Improving Spectral Clustering Using the Asymptotic Value of the Normalized Cut
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- An \(\ell_\infty\) eigenvector perturbation bound and its application to robust covariance estimation
- Spectral convergence of the connection Laplacian from random samples
- Analysis of spectral clustering algorithms for community detection: the general bipartite setting
- High-Dimensional Probability
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Spectral Convergence of Diffusion Maps: Improved Error Bounds and an Alternative Normalization
- Spectral Methods for Data Science: A Statistical Perspective
- Clustering with t-SNE, Provably
- Lipschitz Regularity of Graph Laplacians on Random Data Clouds
- On the Spectral Property of Kernel-Based Sensor Fusion Algorithms of High Dimensional Data
- Kernel Methods and Machine Learning
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Local Linear Regression on Manifolds and Its Geometric Interpretation
- Nonlinear Dimensionality Reduction
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- DISTRIBUTION OF EIGENVALUES FOR SOME SETS OF RANDOM MATRICES
- Graph Based Gaussian Processes on Restricted Domains
- Scalability and robustness of spectral embedding: landmark diffusion is all you need
- Graph connection Laplacian methods can be made robust to noise