Testing for principal component directions under weak identifiability
From MaRDI portal
Abstract: We consider the problem of testing, on the basis of a p-variate Gaussian random sample, the null hypothesis H₀: θ₁ = θ₀ against the alternative H₁: θ₁ ≠ θ₀, where θ₁ is the "first" eigenvector of the underlying covariance matrix and θ₀ is a fixed unit p-vector. In the classical setup where eigenvalues are fixed, the Anderson (1963) likelihood ratio test (LRT) and the Hallin, Paindaveine and Verdebout (2010) Le Cam optimal test for this problem are asymptotically equivalent under the null hypothesis, hence also under sequences of contiguous alternatives. We show that this equivalence does not survive asymptotic scenarios where λₙ₁/λₙ₂ = 1 + O(rₙ) with rₙ = O(1/√n). For such scenarios, the Le Cam optimal test still asymptotically meets the nominal level constraint, whereas the LRT severely overrejects the null hypothesis. Consequently, the former test should be favored over the latter whenever the two largest sample eigenvalues are close to each other. By relying on Le Cam's asymptotic theory of statistical experiments, we study the non-null and optimality properties of the Le Cam optimal test in the aforementioned asymptotic scenarios and show that the null robustness of this test is not obtained at the expense of power. Our asymptotic investigation is extensive in the sense that it allows rₙ to converge to zero at an arbitrary rate. While we restrict to single-spiked spectra of the form λₙ₁ > λₙ₂ = ⋯ = λₙₚ to make our results as striking as possible, we extend our results to the more general elliptical case. Finally, we present an illustrative real data example.
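To make the LRT side of the comparison concrete, the sketch below computes Anderson's (1963) classical statistic n(λ̂₁ θ₀ᵀS⁻¹θ₀ + θ₀ᵀSθ₀/λ̂₁ − 2), which is asymptotically χ²ₚ₋₁ under H₀ when the eigenvalues are well separated. The function name and the simulation setup are illustrative assumptions, not taken from the paper; the paper's point is precisely that this χ² calibration breaks down when λₙ₁/λₙ₂ → 1.

```python
import numpy as np
from scipy import stats

def anderson_lrt(X, theta0):
    """Anderson's (1963) LRT statistic for H0: the first eigenvector of
    Cov(X) equals the fixed unit vector theta0, with its chi^2_{p-1}
    asymptotic p-value (valid only for well-separated eigenvalues)."""
    n, p = X.shape
    S = np.cov(X, rowvar=False)
    lam1 = np.linalg.eigvalsh(S)[-1]        # largest sample eigenvalue
    a = theta0 @ np.linalg.inv(S) @ theta0  # theta0' S^{-1} theta0
    b = theta0 @ S @ theta0                 # theta0' S theta0
    T = n * (lam1 * a + b / lam1 - 2.0)     # >= 0 by Cauchy-Schwarz + AM-GM
    return T, stats.chi2.sf(T, df=p - 1)

# Illustrative simulation: diagonal covariance diag(5, 1, 1, 1), so the
# true first eigenvector is e1 and the spike is well separated.
rng = np.random.default_rng(42)
n, p = 500, 4
X = rng.standard_normal((n, p)) * np.sqrt(np.array([5.0, 1.0, 1.0, 1.0]))
T0, p0 = anderson_lrt(X, np.eye(p)[0])  # under H0: moderate T, valid p-value
T1, p1 = anderson_lrt(X, np.eye(p)[1])  # under H1: large T, tiny p-value
```

In the weakly identified regime the paper studies (λₙ₁/λₙ₂ = 1 + O(1/√n)), the χ²ₚ₋₁ reference distribution used above is no longer valid and the LRT overrejects, whereas the Le Cam optimal test keeps its nominal level.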
Recommendations
- Optimal rank-based testing for principal components
- Asymptotic distribution of the LR statistic for equality of the smallest eigenvalues in high-dimensional principal component analysis
- Large sample approximations for the LR statistic for equality of the smallest eigenvalues of a covariance matrix under elliptical population
- Signal detection in high dimension: the multispiked case
- Optimal rank-based tests for common principal components
- An Approximation for the Test of the Equality of the Smallest Eigenvalues of a Covariance Matrix
- Optimal hypothesis testing for high dimensional covariance matrices
- Likelihood ratio tests for principal components
Cites work
- scientific article; zbMATH DE number 3780417
- scientific article; zbMATH DE number 41813
- scientific article; zbMATH DE number 49702
- scientific article; zbMATH DE number 192992
- scientific article; zbMATH DE number 1964693
- A class of asymptotic tests for principal component vectors
- A partial overview of the theory of statistics with functional data
- Analysis of Multivariate and High-Dimensional Data
- Applied Multivariate Statistical Analysis
- Asymptotic Statistics
- Asymptotic Theory for Principal Component Analysis
- Asymptotic inference for eigenvectors
- Conditional inference for possibly unidentified structural equations
- Efficient R-estimation of principal and common principal components
- Exploring multivariate data with the forward search.
- Inference for eigenvalues and eigenvectors of Gaussian symmetric matrices
- Inference on the mode of weak directional signals: a Le Cam perspective on hypothesis testing near singularities
- Kernel-based functional principal components
- Lower Risk Bounds and Properties of Confidence Sets for Ill-Posed Estimation Problems with Applications to Spectral Density and Persistence Estimation, Unit Roots, and Estimation of Long Memory Parameters
- Monte Carlo tests with nuisance parameters: a general approach to finite-sample inference and nonstandard asymptotics
- Multivariate mode hunting: Data analytic tools with measures of significance
- New theory of discriminant analysis after R. Fisher. Advanced research by the feature selection method for microarray data
- On consistency and sparsity for principal components analysis in high dimensions
- Optimal detection of sparse principal components in high dimension
- Optimal rank-based testing for principal components
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- Principal Components Analysis Based on Multivariate MM Estimators With Fast and Robust Bootstrap
- Principal component analysis based on robust estimators of the covariance or correlation matrix: influence functions and efficiencies
- Robust Principal Component Analysis Based on Maximum Correntropy Criterion
- Robust functional principal components: a projection-pursuit approach
- Scale-invariant sparse PCA on high-dimensional meta-elliptical data
- Semiparametrically efficient rank-based inference for shape. I: optimal rank-based tests for sphericity
- Some Impossibility Theorems in Econometrics With Applications to Structural and Dynamic Models
- Testing for principal component directions under weak identifiability
Cited in (8)
- Inference on the mode of weak directional signals: a Le Cam perspective on hypothesis testing near singularities
- On the asymptotic behavior of the leading eigenvector of Tyler's shape estimator under weak identifiability
- Power enhancement for dimension detection of Gaussian signals
- Tests concerning two non-isotropic principal components
- On the power of axial tests of uniformity on spheres
- Testing for principal component directions under weak identifiability
- Detecting the direction of a signal on high-dimensional spheres: non-null and Le Cam optimality results
- Sign tests for weak principal directions
This page was built for publication: Testing for principal component directions under weak identifiability
MaRDI item Q2176623