Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics
From MaRDI portal (MaRDI item Q1747733)
Abstract: Perturbation bounds for singular spaces, in particular Wedin's theorem, are a fundamental tool in many fields including high-dimensional statistics, machine learning, and applied mathematics. In this paper, we establish separate perturbation bounds, measured in both spectral and Frobenius distances, for the left and right singular subspaces. Lower bounds, which show that the individual perturbation bounds are rate-optimal, are also given. The new perturbation bounds are applicable to a wide range of problems. We then consider in detail applications to low-rank matrix denoising and singular space estimation, high-dimensional clustering, and canonical correlation analysis (CCA). In particular, separate matching upper and lower bounds are obtained for estimating the left and right singular spaces. To the best of our knowledge, this is the first result that gives different optimal rates for the left and right singular spaces under the same perturbation. In addition to these problems, applications to other high-dimensional problems such as community detection in bipartite networks, multidimensional scaling, and cross-covariance matrix estimation are also discussed.
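The phenomenon described in the abstract can be illustrated numerically. The following is a minimal sketch (not taken from the paper) using NumPy: for a tall low-rank signal matrix observed under Gaussian noise, the left singular subspace (dimension \(p_1\)) is typically harder to estimate than the right one (dimension \(p_2 \ll p_1\)), in line with the separate rates the paper establishes. The spectral \(\sin\Theta\) distance used here is the standard one; all dimensions and the signal strength are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall low-rank signal: p1 >> p2, so left and right subspaces behave differently.
p1, p2, r = 200, 20, 2
U, _ = np.linalg.qr(rng.standard_normal((p1, r)))
V, _ = np.linalg.qr(rng.standard_normal((p2, r)))
A = 30.0 * U @ V.T                     # rank-r signal, singular values = 30
X = A + rng.standard_normal((p1, p2))  # observed matrix = signal + Gaussian noise

# Empirical singular subspaces from the top-r SVD of the observation.
Uhat_full, _, Vhat_t = np.linalg.svd(X)
Uhat, Vhat = Uhat_full[:, :r], Vhat_t[:r, :].T

def sin_theta_spectral(Q, Qhat):
    """Spectral sin-Theta distance between the column spaces of Q and Qhat."""
    sv = np.linalg.svd(Q.T @ Qhat, compute_uv=False)
    return float(np.sqrt(max(0.0, 1.0 - np.min(sv) ** 2)))

left_err = sin_theta_spectral(U, Uhat)    # error for the left singular space
right_err = sin_theta_spectral(V, Vhat)   # error for the right singular space
```

With these (illustrative) parameters, the left-subspace error scales roughly like \(\sqrt{p_1}\) over the signal strength and the right-subspace error like \(\sqrt{p_2}\), so `left_err` comes out noticeably larger than `right_err`, which is the asymmetry the separate upper and lower bounds quantify.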
Recommendations
- The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
- An \(\ell_{\infty}\) eigenvector perturbation bound and its application
- Random perturbation of low rank matrices: improving classical bounds
- Singular vectors under random perturbation
- Additive relative perturbation bounds of singular subspaces
Cites work
- scientific article; zbMATH DE number 1964693 (no title available)
- scientific article; zbMATH DE number 6026126 (no title available)
- A note on \(\sin\Theta\) theorems for singular subspace variations
- A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis
- Asymptotic theory for estimating the singular vectors and values of a partially-observed low rank matrix with noise
- Canonical Correlation Analysis: An Overview with Application to Learning Methods
- Consistency of spectral clustering
- Consistency of spectral clustering in stochastic block models
- Convex multi-task feature learning
- Exact matrix completion via convex optimization
- Influential features PCA for high dimensional clustering
- Interior-point method for nuclear norm approximation with application to system identification
- Large-scale simultaneous testing of cross-covariance matrices with applications to PheWAS
- Matrix completion from noisy entries
- Matrix estimation by universal singular value thresholding
- Minimax estimation in sparse canonical correlation analysis
- Minimax risk of matrix denoising by singular value thresholding
- Modern multidimensional scaling. Theory and applications.
- On consistency and sparsity for principal components analysis in high dimensions
- On the distribution of the left singular vectors of a random matrix and its applications
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Perturbation bounds in connection with singular value decomposition
- Perturbation of the SVD in the presence of small singular values
- Phase transitions for high dimensional clustering and related problems
- Relations Between Two Sets of Variates
- Random perturbation of low rank matrices: improving classical bounds
- Randomized Dimensionality Reduction for \(k\)-Means Clustering
- Rate Optimal Denoising of Simultaneously Sparse and Low Rank Matrices
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Singular vector perturbation under Gaussian noise
- Singular vectors under random perturbation
- Sparse CCA via precision adjusted iterative thresholding
- Sparse CCA: adaptive estimation and computational barriers
- Sparse PCA: optimal rates and adaptive estimation
- Spectral clustering and the high-dimensional stochastic blockmodel
- The Optimal Hard Threshold for Singular Values is \(4/\sqrt{3}\)
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- The Rotation of Eigenvectors by a Perturbation. III
- The largest eigenvalues of finite rank deformation of large Wigner matrices: Convergence and nonuniversality of the fluctuations
- The singular values and vectors of low rank perturbations of large rectangular random matrices
- Turning big data into tiny data: constant-size coresets for \(k\)-means, PCA and projective clustering
- Unbiased Risk Estimates for Singular Value Thresholding and Spectral Estimators
- Uniqueness of low-rank matrix completion by rigidity theory
Cited in (45)
- A Schatten-q low-rank matrix perturbation analysis via perturbation projection error bound
- Hybrid Kronecker Product Decomposition and Approximation
- scientific article; zbMATH DE number 7370563 (no title available)
- A note on the orthogonal Procrustes problem and norm-dependent optimality
- One-dimensional tensor network recovery
- The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
- Efficient kernel canonical correlation analysis using Nyström approximation
- Estimation of canonical correlation directions: from Gaussian to sub-Gaussian population
- Subspace perspective on canonical correlation analysis: dimension reduction and minimax rates
- Two lower bounds about singular subspaces
- On the non-asymptotic concentration of heteroskedastic Wishart-type matrix
- Normal approximation and confidence region of singular subspaces
- The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
- Optimal estimation and computational limit of low-rank Gaussian mixtures
- Distributed Estimation for Principal Component Analysis: An Enlarged Eigenspace Analysis
- Perturbation of linear forms of singular vectors under Gaussian noise
- Estimation of misclassification rate in the Asymptotic Rare and Weak model with sub-Gaussian noises
- Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals
- An \(\ell_{\infty}\) eigenvector perturbation bound and its application
- Quantitative limit theorems and bootstrap approximations for empirical spectral projectors
- Latent Space Model for Higher-Order Networks and Generalized Tensor Decomposition
- Lower bounds for invariant statistical models with applications to principal component analysis
- Perturbation expansions and error bounds for the truncated singular value decomposition
- A stochastic perturbation analysis of the QR decomposition and its applications
- scientific article; zbMATH DE number 7415122 (no title available)
- Angle-based joint and individual variation explained
- Rejoinder
- Optimal estimation for lower bound of the packing number
- Lower bound estimation for a family of high-dimensional sparse covariance matrices
- Perturbation upper bounds for singular subspaces with a kind of heteroskedastic noise and its application in clustering
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- Optimal Permutation Recovery in Permuted Monotone Matrix Model
- Random perturbation of low rank matrices: improving classical bounds
- Statistical Inference for High-Dimensional Matrix-Variate Factor Models
- Euclidean Representation of Low-Rank Matrices and Its Geometric Properties
- Leave-one-out singular subspace perturbation analysis for spectral clustering
- Optimal sparse singular value decomposition for high-dimensional high-order data
- A Projected Subgradient Method for the Computation of Adapted Metrics for Dynamical Systems
- Heteroskedastic PCA: algorithm, optimality, and applications
- Covariance Estimation for Matrix-valued Data
- Singular vector and singular subspace distribution for the matrix denoising model
- Van Trees inequality, group equivariance, and estimation of principal subspaces
- An \({\ell_p}\) theory of PCA and spectral clustering
- Upper bound estimations of misclassification rate in the heteroscedastic clustering model with sub-Gaussian noises
- Inference for heteroskedastic PCA with missing data
This page was built for publication: Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics