PCA consistency for the power spiked model in high-dimensional settings
DOI: 10.1016/j.jmva.2013.08.003
zbMath: 1280.62072
arXiv: 1503.04549
OpenAlex: W1969057389
MaRDI QID: Q391897
Makoto Aoshima, Kazuyoshi Yata
Publication date: 13 January 2014
Published in: Journal of Multivariate Analysis, Methodology and Computing in Applied Probability
Full work available at URL: https://arxiv.org/abs/1503.04549
Keywords: asymptotic normality; heterogeneity; feature selection; microarray data; HDLSS; noise-reduction methodology; large \(p\) small \(n\); cross-data-matrix methodology; Bayes error rate
MSC classification:
- Multivariate distribution of statistics (62H10)
- Factor analysis and principal components; correspondence analysis (62H25)
- Estimation in multivariate analysis (62H12)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Asymptotic distribution of eigenvalues, asymptotic theory of eigenfunctions for ordinary differential operators (34L20)
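The keywords refer to the noise-reduction methodology for high-dimension, low-sample-size (HDLSS) PCA developed in the Yata-Aoshima papers listed under Cites Work. As an illustrative aside, the sketch below simulates a hypothetical two-spike HDLSS model and applies an eigenvalue adjustment of the form \(\tilde\lambda_j = \hat\lambda_j - (\operatorname{tr} S_D - \sum_{i\le j} \hat\lambda_i)/(n-1-j)\), the form used in the cited noise-reduction paper. All parameter choices and the diagonal spiked covariance are hypothetical, and this is a sketch of the general idea, not the power spiked model analysed in this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-spike HDLSS setting: p >> n, spike eigenvalues on unit noise.
p, n = 2000, 30
spikes = np.array([80.0, 40.0])                     # true spike eigenvalues
eigvals = np.concatenate([spikes, np.ones(p - 2)])  # Sigma = diag(eigvals)

# n observations from N(0, Sigma); rows are observations, columns variables.
X = rng.standard_normal((n, p)) * np.sqrt(eigvals)

# Dual (n x n) sample covariance matrix S_D: it shares its nonzero
# eigenvalues with the usual p x p sample covariance matrix.
Xc = X - X.mean(axis=0)
S_D = Xc @ Xc.T / (n - 1)
lam_hat = np.sort(np.linalg.eigvalsh(S_D))[::-1]

# Noise-reduction adjustment:
#   lam_tilde_j = lam_hat_j - (tr S_D - sum_{i<=j} lam_hat_i) / (n - 1 - j)
k = len(spikes)
j = np.arange(1, k + 1)
lam_tilde = lam_hat[:k] - (lam_hat.sum() - np.cumsum(lam_hat[:k])) / (n - 1 - j)

print("true spikes:    ", spikes)
print("naive estimates:", lam_hat[:k])   # inflated by accumulated noise
print("noise-reduced:  ", lam_tilde)     # bias largely removed
```

In a typical run, the naive dual eigenvalues exceed the true spikes by roughly \(\operatorname{tr}(\Sigma_{\text{noise}})/(n-1) \approx 69\), the bias that the HDLSS geometric-representation results predict, and the adjustment largely removes it.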
Cites Work
- Distance-Weighted Discrimination
- Higher criticism for large-scale inference, especially for rare and weak effects
- Correlation tests for high-dimensional data using extended cross-data-matrix methodology
- Boundary behavior in high dimension, low sample size asymptotics of PCA
- Sparse linear discriminant analysis by thresholding for high dimensional data
- Convergence and prediction of principal component scores in high-dimensional settings
- A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data
- Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations
- Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix
- Covariance regularization by thresholding
- High-dimensional classification using features annealed independence rules
- PCA consistency in high dimension, low sample size context
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- On the distribution of the largest eigenvalue in principal components analysis
- A two-sample test for high-dimensional data with applications to gene-set testing
- Regularized estimation of large covariance matrices
- Eigenvalues of large sample covariance matrices of spiked population models
- Asymptotic normality for inference on multisample, high-dimensional mean vectors under mild conditions
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- Estimation of spiked eigenvalues in spiked models
- Bias-Corrected Diagonal Discriminant Rules for High-Dimensional Classification
- A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation
- Two-Stage Procedures for High-Dimensional Data
- Authors' Response
- Geometric Classifier for Multiclass, High-Dimensional Data
- A Direct Estimation Approach to Sparse Linear Discriminant Analysis
- Sparse Quadratic Discriminant Analysis For High Dimensional Data
- Scale adjustments for classifiers in high-dimensional, low sample size settings
- PCA Consistency for Non-Gaussian Data in High Dimension, Low Sample Size Context
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Geometric Representation of High Dimension, Low Sample Size Data
- The high-dimension, low-sample-size geometric representation holds under mild conditions
- A Road to Classification in High Dimensional Space: The Regularized Optimal Affine Discriminant