Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix
DOI: 10.1016/j.jmva.2010.04.006
zbMATH Open: 1203.62112
OpenAlex: W2060651165
MaRDI QID: Q990890
Authors: Kazuyoshi Yata, Makoto Aoshima
Publication date: 1 September 2010
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2010.04.006
Recommendations
- Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations
- On asymptotic normality of cross data matrix-based PCA in high dimension low sample size
- PCA consistency in high dimension, low sample size context
- PCA Consistency for Non-Gaussian Data in High Dimension, Low Sample Size Context
- Consistency of sparse PCA in high dimension, low sample size contexts
- Robust PCA for high-dimensional data
- Fast cross-validation of high-breakdown resampling methods for PCA
- Perturbation theory for cross data matrix-based PCA
Keywords: consistency; principal component analysis; mixture model; eigenvalue distribution; singular value; HDLSS; microarray data analysis
MSC classification:
- 62H25 Factor analysis and principal components; correspondence analysis
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 62E20 Asymptotic distribution theory in statistics
- 15A18 Eigenvalues, singular values, and eigenvectors
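The cross-data-matrix idea referenced in the title and keywords can be illustrated with a short sketch: split the sample into two halves, form the matrix of inner products between the halves, and read eigenvalue estimates off its singular values. This is a minimal, hedged reconstruction of the general approach, not the authors' exact estimator; the function name and all implementation details here are assumptions for illustration.

```python
import numpy as np

def cross_data_matrix_pca(X, n_components=1, seed=0):
    """Estimate leading covariance eigenvalues via the SVD of a cross
    data matrix (a hedged sketch of the idea, not the paper's exact method).

    X : (n, d) array of n samples in d dimensions (n may be << d).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    idx = rng.permutation(n)
    n1 = n // 2
    X1, X2 = X[idx[:n1]], X[idx[n1:]]
    # Center each half by its own sample mean.
    X1c = X1 - X1.mean(axis=0)
    X2c = X2 - X2.mean(axis=0)
    # Cross data matrix: inner products *between* the two independent
    # halves, so the diagonal noise accumulation that biases the usual
    # n x n Gram (dual covariance) matrix in HDLSS settings drops out.
    SD = X1c @ X2c.T / np.sqrt((n1 - 1) * (len(X2) - 1))
    # Singular values of SD serve as estimates of the top eigenvalues
    # of the population covariance matrix.
    s = np.linalg.svd(SD, compute_uv=False)
    return s[:n_components]
```

In a spiked model with one large eigenvalue and d much larger than n, the leading singular value recovers the spike far more stably than the leading eigenvalue of the sample covariance matrix, which carries an upward noise bias of order tr(Σ)/n.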
Cites Work
- On the distribution of the largest eigenvalue in principal components analysis
- Eigenvalues of large sample covariance matrices of spiked population models
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Asymptotics of sample eigenstructure for a large dimensional spiked covariance model
- PCA consistency in high dimension, low sample size context
- Geometric Representation of High Dimension, Low Sample Size Data
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- The high-dimension, low-sample-size geometric representation holds under mild conditions
- PCA Consistency for Non-Gaussian Data in High Dimension, Low Sample Size Context
- Intrinsic dimensionality estimation of high-dimension, low sample size data with \(D\)-asymptotics
Cited In (33)
- Sparse-smooth regularized singular value decomposition
- A test of sphericity for high-dimensional data and its application for detection of divergently spiked noise
- Statistical inference for high-dimension, low-sample-size data
- Authors' response
- A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data
- PCA consistency for the power spiked model in high-dimensional settings
- A High-Dimensional Two-Sample Test for Non-Gaussian Data under a Strongly Spiked Eigenvalue Model
- A survey of high dimension low sample size asymptotics
- Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models
- Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations
- Asymptotic properties of the first principal component and equality tests of covariance matrices in high-dimension, low-sample-size context
- Boundary behavior in high dimension, low sample size asymptotics of PCA
- Polynomial whitening for high-dimensional data
- Statistical inference under the strongly spiked eigenvalue model
- High-dimensional inference on covariance structures via the extended cross-data-matrix methodology
- Correlation tests for high-dimensional data using extended cross-data-matrix methodology
- Perturbation theory for cross data matrix-based PCA
- High dimension low sample size asymptotics of robust PCA
- Projection pursuit via white noise matrices
- Equality tests of high-dimensional covariance matrices under the strongly spiked eigenvalue model
- Inference on high-dimensional mean vectors with fewer observations than the dimension
- Clustering by principal component analysis with Gaussian kernel in high-dimension, low-sample-size settings
- Geometric classifiers for high-dimensional noisy data
- Correlation matrix of equi-correlated normal population: fluctuation of the largest eigenvalue, scaling of the bulk eigenvalues, and stock market
- Discussion on "Two-stage procedures for high-dimensional data" by Makoto Aoshima and Kazuyoshi Yata
- On asymptotic normality of cross data matrix-based PCA in high dimension low sample size
- Inference on high-dimensional mean vectors under the strongly spiked eigenvalue model
- Two-stage procedures for high-dimensional data
- Hypothesis tests for high-dimensional covariance structures
- Test for high-dimensional outliers with principal component analysis
- A singular value decomposition of a \(k\)-way array for a principal component analysis of multiway data, \(\text{PTA-}k\)
- Asymptotic normality for inference on multisample, high-dimensional mean vectors under mild conditions
- Title not available