Asymptotic performance of PCA for high-dimensional heteroscedastic data
DOI: 10.1016/j.jmva.2018.06.002; zbMath: 1395.62139; arXiv: 1703.06610; OpenAlex: W2734385411; Wikidata: Q91692254; Scholia: Q91692254; MaRDI QID: Q1661372
David Hong, Laura Balzano, Jeffrey A. Fessler
Publication date: 16 August 2018
Published in: Journal of Multivariate Analysis
Full work available at URL: https://arxiv.org/abs/1703.06610
Keywords: principal component analysis; heteroscedasticity; high-dimensional data; subspace estimation; asymptotic random matrix theory
Mathematics Subject Classification:
- 62F12 Asymptotic properties of parametric estimators
- 62H25 Factor analysis and principal components; correspondence analysis
- 62H12 Estimation in multivariate analysis
Related Items
- Heteroskedastic PCA: algorithm, optimality, and applications
- On the non-asymptotic concentration of heteroskedastic Wishart-type matrix
- Stochastic Gradients for Large-Scale Tensor Decomposition
- Biwhitening Reveals the Rank of a Count Matrix
- Optimally Weighted PCA for High-Dimensional Heteroscedastic Data
- Asymptotic performance of PCA for high-dimensional heteroscedastic data
- ScreeNOT: exact MSE-optimal singular value thresholding in correlated noise
- Rapid evaluation of the spectral signal detection threshold and Stieltjes transform
- A note on identifiability conditions in confirmatory factor analysis
- Matrix Denoising for Weighted Loss Functions and Heterogeneous Signals
- Factor Extraction in Dynamic Factor Models: Kalman Filter Versus Principal Components
Uses Software
Cites Work
- The singular values and vectors of low rank perturbations of large rectangular random matrices
- High breakdown estimators for principal components: the projection-pursuit approach revisited
- On sample eigenvalues in a generalized spiked population model
- Strong convergence of the empirical distribution of eigenvalues of sample covariance matrices with a perturbation matrix
- Covariance regularization by thresholding
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- The polynomial method for random matrices
- Spectral analysis of large dimensional random matrices
- Asymptotic performance of PCA for high-dimensional heteroscedastic data
- On the distribution of the largest eigenvalue in principal components analysis
- Principal component analysis.
- Matrix estimation by universal singular value thresholding
- Robust computation of linear models by convex relaxation
- Recursive Robust PCA or Recursive Sparse Recovery in Large but Structured Noise
- OptShrink: An Algorithm for Improved Low-Rank Signal Matrix Denoising by Optimal, Data-Driven Singular Value Shrinkage
- Asymptotic Conditional Singular Value Decomposition for High-Dimensional Genomic Data
- Robust principal component analysis?
- Rank-Sparsity Incoherence for Matrix Decomposition
- Statistical challenges of high-dimensional data
- Robust Estimation of Dispersion Matrices and Principal Components
- Probabilistic Principal Component Analysis
- Statistical mechanics of unsupervised structure recognition
- Large Sample Covariance Matrices and High-Dimensional Data Analysis
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Robust PCA via Outlier Pursuit
- Robust Statistics