An \(\ell_{\infty}\) eigenvector perturbation bound and its application

From MaRDI portal
Publication:4558538

zbMATH Open: 1473.15015 · arXiv: 1603.03516 · MaRDI QID: Q4558538 · FDO: Q4558538


Authors: Yiqiao Zhong, Jianqing Fan, Weichen Wang


Publication date: 22 November 2018

Abstract: In statistics and machine learning, one is often interested in the eigenvectors (or singular vectors) of certain matrices (e.g. covariance matrices, data matrices). However, those matrices are usually perturbed by noise or statistical error, arising either from random sampling or from structural patterns. One usually employs the Davis-Kahan \(\sin\Theta\) theorem to bound the difference between the eigenvectors of a matrix \(A\) and those of a perturbed matrix \(\widetilde{A}=A+E\), in terms of the \(\ell_2\) norm. In this paper, we prove that when \(A\) is a low-rank and incoherent matrix, the \(\ell_{\infty}\) norm perturbation bound of singular vectors (or eigenvectors in the symmetric case) is smaller by a factor of \(\sqrt{d_1}\) or \(\sqrt{d_2}\) for left and right vectors, where \(d_1\) and \(d_2\) are the matrix dimensions. The power of this new perturbation result is shown in robust covariance estimation, particularly when random variables have heavy tails. There, we propose new robust covariance estimators and establish their asymptotic properties using the newly developed perturbation bound. Our theoretical results are verified through extensive numerical experiments.
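The phenomenon the abstract describes can be observed numerically. The sketch below (an illustration only, not the paper's estimator) builds a rank-one symmetric matrix \(A\) with an incoherent eigenvector (all entries of order \(1/\sqrt{d}\)), adds a small symmetric Gaussian perturbation \(E\), and compares the \(\ell_2\) and \(\ell_{\infty}\) errors of the top eigenvector; the dimension, eigengap, and noise scale are arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 500                          # matrix dimension (illustrative choice)

# Rank-1 incoherent A: top eigenvector has entries exactly 1/sqrt(d)
u = np.ones(d) / np.sqrt(d)
A = 5.0 * np.outer(u, u)         # eigengap 5.0

# Small symmetric noise perturbation E
G = rng.normal(scale=0.01, size=(d, d))
E = (G + G.T) / 2

# Top eigenvectors of A and the perturbed matrix A + E
_, V = np.linalg.eigh(A)
_, Vt = np.linalg.eigh(A + E)
v, vt = V[:, -1], Vt[:, -1]
vt = vt if vt @ v > 0 else -vt   # resolve the sign ambiguity

err2 = np.linalg.norm(vt - v)    # ell_2 error (Davis-Kahan scale)
errinf = np.max(np.abs(vt - v))  # ell_inf error

# For an incoherent low-rank A, the ell_inf error is smaller than the
# ell_2 error by roughly a factor of sqrt(d), as the paper's bound predicts
print(f"ell_2: {err2:.4f}   ell_inf: {errinf:.4f}   "
      f"ell_2/sqrt(d): {err2 / np.sqrt(d):.4f}")
```

Running this, the \(\ell_{\infty}\) error comes out close to the \(\ell_2\) error divided by \(\sqrt{d}\), rather than merely bounded by the \(\ell_2\) error itself.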


Full work available at URL: https://arxiv.org/abs/1603.03516










Cited In (45)





This page was built for publication: An \(\ell_{\infty}\) eigenvector perturbation bound and its application
