Long random matrices and tensor unfolding
From MaRDI portal
Publication:6180391
Abstract: In this paper, we consider the singular values and singular vectors of low-rank perturbations of large rectangular random matrices, in the regime where the matrix is "long": we allow the number of rows (columns) to grow polynomially in the number of columns (rows). We prove that there exists a critical signal-to-noise ratio (depending on the dimensions of the matrix) at which the extreme singular values and singular vectors exhibit a BBP-type phase transition. As a main application, we investigate the tensor unfolding algorithm for the asymmetric rank-one spiked tensor model and obtain an exact threshold, which is independent of the choice of unfolding. If the signal-to-noise ratio is above the threshold, tensor unfolding detects the signal; otherwise, it fails to capture it.
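To illustrate the tensor unfolding algorithm discussed in the abstract, here is a minimal numerical sketch (not the authors' code; the tensor size and signal-to-noise ratio below are illustrative assumptions). A rank-one spike is planted in a Gaussian tensor, the tensor is unfolded (reshaped) into a long rectangular matrix, and the top singular value is compared against the bulk of the noise spectrum:

```python
# Sketch of tensor unfolding for the asymmetric rank-one spiked tensor model
# T = beta * (u ⊗ v ⊗ w) + W, with W having i.i.d. standard Gaussian entries.
# The mode-1 unfolding reshapes the n x n x n tensor into an n x n^2 matrix;
# when beta is above the critical signal-to-noise ratio, the top singular
# value of the unfolding separates from the noise bulk.
import numpy as np

rng = np.random.default_rng(0)
n = 30        # illustrative tensor dimension (assumption)
beta = 50.0   # illustrative signal-to-noise ratio, chosen above threshold

# Unit-norm signal vectors.
u, v, w = (x / np.linalg.norm(x) for x in rng.standard_normal((3, n)))

signal = beta * np.einsum("i,j,k->ijk", u, v, w)
noise = rng.standard_normal((n, n, n))
T = signal + noise

# Mode-1 unfolding: an n x n^2 "long" rectangular matrix.
M = T.reshape(n, n * n)

top_sv = np.linalg.svd(M, compute_uv=False)[0]

# Baseline: top singular value of the pure-noise unfolding (the bulk edge).
noise_sv = np.linalg.svd(noise.reshape(n, n * n), compute_uv=False)[0]

print(f"top singular value with spike: {top_sv:.1f}")
print(f"top singular value, noise only: {noise_sv:.1f}")
```

With this choice of parameters the spiked singular value emerges clearly above the noise edge; shrinking `beta` toward the dimension-dependent critical value makes the outlier merge back into the bulk, which is the phase transition the paper quantifies.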
Cites work
- Scientific article, zbMATH DE number 7415122 (no title available)
- A Multilinear Singular Value Decomposition
- Algorithmic thresholds for tensor PCA
- An optimal statistical and computational framework for generalized tensor estimation
- Asymptotic power of sphericity tests for high-dimensional data
- Asymptotics of sample eigenstructure for a large dimensional spiked covariance model
- Eigenvalues of large sample covariance matrices of spiked population models
- Eigenvector distribution in the critical regime of BBP transition
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Fluctuations of the extreme eigenvalues of finite rank deformations of random matrices
- How to iron out rough landscapes and get optimal performances: averaged gradient descent and its application to tensor PCA
- Isotropic local laws for sample covariance and generalized Wigner matrices
- Mesoscopic perturbations of large random matrices
- Minimax bounds for sparse PCA with noisy high-dimensional data
- Minimax sparse principal subspace estimation in high dimensions
- Nonlinear shrinkage estimation of large-dimensional covariance matrices
- Notes on computational-to-statistical gaps: predictions using statistical physics
- On consistency and sparsity for principal components analysis in high dimensions
- On sample eigenvalues in a generalized spiked population model
- On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
- On the distribution of the largest eigenvalue in principal components analysis
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Optimal shrinkage of eigenvalues in the spiked covariance model
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- Phase transition in random tensors with multiple independent spikes
- Phase transition in the spiked random tensor with Rademacher prior
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- Sparse PCA: optimal rates and adaptive estimation
- Sparse principal component analysis and iterative thresholding
- Spectrum estimation for large dimensional covariance matrices using random matrix theory
- Spiked singular values and vectors under extreme aspect ratios
- Statistical limits of spiked tensor models
- Statistical thresholds for tensor PCA
- Tensor Decomposition for Signal Processing and Machine Learning
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Tensor SVD: Statistical and Computational Limits
- Tensor clustering with planted structures: statistical optimality and computational limits
- Tensor spaces and numerical tensor calculus
- The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices
- The landscape of the spiked tensor model
- The singular values and vectors of low rank perturbations of large rectangular random matrices