An \({\ell_p}\) theory of PCA and spectral clustering
DOI: 10.1214/22-AOS2196
MaRDI QID: Q2091846
Emmanuel Abbe, Jianqing Fan, Kaizheng Wang
Publication date: 2 November 2022
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2006.14062
Keywords: principal component analysis; phase transitions; mixture models; community detection; spectral clustering; eigenvector perturbation; contextual network models
Mathematics Subject Classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Related Items
- Estimation of misclassification rate in the Asymptotic Rare and Weak model with sub-Gaussian noises
- Learning low-dimensional nonlinear structures from high-dimensional noisy data: an integral operator approach
Cites Work
- Minimax rates of community detection in stochastic block models
- The singular values and vectors of low rank perturbations of large rectangular random matrices
- Community detection in networks with node features
- Concentration inequalities and moment bounds for sample covariance operators
- Asymptotics and concentration bounds for bilinear forms of spectral projectors of sample covariance
- Influential features PCA for high dimensional clustering
- A spectral algorithm for learning mixture models
- On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- PCA consistency in high dimension, low sample size context
- Random perturbation of low rank matrices: improving classical bounds
- Robust covariance estimation for approximate factor models
- Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics
- On the distribution of the largest eigenvalue in principal components analysis
- Random matrix approximation of spectra of integral operators
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Asymptotics of empirical eigenstructure for high dimensional spiked covariance
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Optimality of spectral clustering in the Gaussian mixture model
- Heteroskedastic PCA: algorithm, optimality, and applications
- Iterative algorithm for discrete structure recovery
- Entrywise eigenvector analysis of random matrices with low expected rank
- Hanson-Wright inequality in Hilbert spaces with application to \(K\)-means clustering for non-Euclidean data
- Semicircle law on short scales and delocalization of eigenvectors for Wigner random matrices
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Partial recovery bounds for clustering with the relaxed \(K\)-means
- The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
- Statistical properties of kernel principal component analysis
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- Exact matrix completion via convex optimization
- Concentration of kernel matrices with application to kernel spectral clustering
- Superconcentration and Related Topics
- Perturbation of Linear Forms of Singular Vectors Under Gaussian Noise
- Exact Recovery in the Stochastic Block Model
- Relax, No Need to Round
- Community Detection and Stochastic Block Models
- Clustering subgaussian mixtures by semidefinite programming
- Near-Optimal Bounds for Phase Synchronization
- Least squares quantization in PCM
- Spectral Algorithms for Tensor Completion
- Covariate Regularized Community Detection in Sparse Graphs
- Cutoff for Exact Recovery of Gaussian Mixture Models
- Improved Clustering Algorithms for the Bipartite Stochastic Block Model
- Uniform Bounds for Invariant Subspace Perturbations
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Spectral techniques applied to sparse random graphs
- Covariate-assisted spectral clustering
- The Rotation of Eigenvectors by a Perturbation. III
- Asymptotic Theory for Principal Component Analysis
- Perturbation bounds in connection with singular value decomposition
- Theory of Reproducing Kernels
- Estimating Mixed Memberships With Sharp Eigenvector Deviations