Finite sample approximation results for principal component analysis: A matrix perturbation approach


DOI: 10.1214/08-AOS618
zbMath: 1168.62058
arXiv: 0901.3245
OpenAlex: W3105364218
MaRDI QID: Q1000307

Boaz Nadler

Publication date: 6 February 2009

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/0901.3245



Related Items

Heteroskedastic PCA: algorithm, optimality, and applications
Wald Statistics in high-dimensional PCA
Principal components in linear mixed models with general bulk
Biwhitening Reveals the Rank of a Count Matrix
Optimally Weighted PCA for High-Dimensional Heteroscedastic Data
Statistical inference for principal components of spiked covariance matrices
Perturbation theory for cross data matrix-based PCA
On the principal components of sample covariance matrices
Sparse PCA-based on high-dimensional Itô processes with measurement errors
Lower bounds for invariant statistical models with applications to principal component analysis
Challenges for Panel Financial Analysis
Recent developments in high dimensional covariance estimation and its related issues, a review
Sparse principal component analysis and iterative thresholding
Asymptotic performance of PCA for high-dimensional heteroscedastic data
Multiscale geometric methods for data sets. I: Multiscale SVD, noise and curvature.
The Impact of Measurement Error on Principal Component Analysis
Minimax bounds for sparse PCA with noisy high-dimensional data
Asymptotic power of sphericity tests for high-dimensional data
Efficient estimation of linear functionals of principal components
Optimal prediction in the linearly transformed spiked model
Optimal detection of sparse principal components in high dimension
Reconstruction of a low-rank matrix in the presence of Gaussian noise
A note on the prediction error of principal component regression in high dimensions
Matrix means and a novel high-dimensional shrinkage phenomenon
Convergence and prediction of principal component scores in high-dimensional settings
Random perturbation of low rank matrices: improving classical bounds
Anisotropic diffusion on sub-manifolds with application to Earth structure classification
On the distribution of an arbitrary subset of the eigenvalues for some finite dimensional random matrices
Fundamental limits of detection in the spiked Wigner model
Nonasymptotic upper bounds for the reconstruction error of PCA
Limiting laws for divergent spiked eigenvalues and largest nonspiked eigenvalue of sample covariance matrices
Fast randomized numerical rank estimation for numerically low-rank matrices
The limiting spectral distribution of large-dimensional general information-plus-noise-type matrices
A CLT for the LSS of large-dimensional sample covariance matrices with diverging spikes
Regression on manifolds: estimation of the exterior derivative
Video denoising via empirical Bayesian estimation of space-time patches
Boundary behavior in high dimension, low sample size asymptotics of PCA
The singular values and vectors of low rank perturbations of large rectangular random matrices
Minimax sparse principal subspace estimation in high dimensions
Sparse PCA: optimal rates and adaptive estimation
Permutation methods for factor analysis and PCA
Random matrix theory in statistics: a review
The spectral norm of random inner-product kernel matrices
Uniform Bounds for Invariant Subspace Perturbations
Statistical challenges of high-dimensional data
Local Linear Regression on Manifolds and Its Geometric Interpretation
The limits of the sample spiked eigenvalues for a high-dimensional generalized Fisher matrix and its applications
Near-optimal stochastic approximation for online principal component estimation
Perturbation of the eigenvectors of the graph Laplacian: application to image denoising
The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices
Sparse wavelet regression with multiple predictive curves
Eigenvectors and eigenvalues in a random subspace of a tensor product
Sparse Principal Component Analysis in Hilbert Space
Treelets -- an adaptive multi-scale basis for sparse unordered data
Bi-cross-validation for factor analysis
Factor analysis via components analysis
Distributed estimation of principal eigenspaces
Panel models with interactive effects
Is there an optimal forecast combination?
Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
Optimality and sub-optimality of PCA. I: Spiked random matrix models
A guide for sparse PCA: model comparison and applications
Random matrix theory and its applications
Consistency of the objective general index in high-dimensional settings
Edge statistics of large dimensional deformed rectangular matrices
The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
An \({\ell_p}\) theory of PCA and spectral clustering
On the sample covariance matrix estimator of reduced effective rank population matrices, with applications to fPCA
Relative perturbation bounds with applications to empirical covariance operators
Do semidefinite relaxations solve sparse PCA up to the information limit?


