Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations
DOI: 10.1016/j.jmva.2011.09.002 · zbMATH Open: 1236.62065 · arXiv: 1503.04525 · OpenAlex: W2060519623 · MaRDI QID: Q764487 · FDO: Q764487
Makoto Aoshima, Kazuyoshi Yata
Publication date: 13 March 2012
Published in: Journal of Multivariate Analysis
Full work available at URL: https://arxiv.org/abs/1503.04525
Recommendations
- Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix
- Statistical inference for high-dimension, low-sample-size data
- PCA consistency in high dimension, low sample size context
- Effective methodologies for high-dimensional data
- Boundary behavior in high dimension, low sample size asymptotics of PCA
Keywords: consistency; discriminant analysis; principal components analysis; eigenvalue distribution; HDLSS; inverse matrix; noise reduction
MSC classifications: Factor analysis and principal components; correspondence analysis (62H25) · Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Asymptotic distribution theory in statistics (62E20) · Estimation in multivariate analysis (62H12) · Eigenvalues, singular values, and eigenvectors (15A18) · Set-valued maps in general topology (54C60)
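The keywords above point to the paper's central device: in the HDLSS regime (dimension d much larger than sample size n), sample eigenvalues are inflated by an accumulated noise term, and the noise-reduction methodology subtracts an estimate of that term computed from the n × n dual sample covariance matrix. The following is a minimal sketch of that idea, assuming the standard dual-matrix formulation of the noise-reduction eigenvalue estimator; the function and variable names are illustrative and not taken from any code accompanying the paper:

```python
import numpy as np

def nr_pca_eigenvalues(X):
    """Noise-reduction (NR) estimates of the leading eigenvalues.

    X : (d, n) data matrix with d >> n (HDLSS); columns are observations.
    Returns NR-adjusted estimates of the first n - 2 eigenvalues of the
    population covariance matrix.
    """
    d, n = X.shape
    Xc = X - X.mean(axis=1, keepdims=True)            # center each variable
    S_dual = (Xc.T @ Xc) / (n - 1)                    # n x n dual sample covariance
    lam = np.sort(np.linalg.eigvalsh(S_dual))[::-1]   # sample eigenvalues, descending
    total = lam.sum()                                 # tr(S_dual) = tr(S)
    lam_nr = np.empty(n - 2)
    for j in range(n - 2):
        # Subtract the averaged residual noise carried by the trailing
        # eigenvalues: lam_j - (tr(S_dual) - sum_{i<=j} lam_i) / (n - 1 - j),
        # with j counted 1-based as in the usual statement of the estimator.
        lam_nr[j] = lam[j] - (total - lam[:j + 1].sum()) / (n - 1 - (j + 1))
    return lam_nr

# Illustrative use on a one-spike model: one coordinate with large variance.
rng = np.random.default_rng(0)
d, n = 2000, 20
X = rng.standard_normal((d, n))
X[0] *= 30.0                      # spiked direction, true top eigenvalue ~900
print(nr_pca_eigenvalues(X)[:3])  # NR estimate of the spike and two noise terms
```

The dual matrix is used because its n eigenvalues coincide with the nonzero eigenvalues of the d × d sample covariance matrix while being cheap to compute when d is huge; the subtraction removes the O(tr(Σ)/n) noise inflation that makes naive HDLSS PCA inconsistent.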
Cites Work
- On the distribution of the largest eigenvalue in principal components analysis
- Eigenvalues of large sample covariance matrices of spiked population models
- Title not available
- PCA consistency in high dimension, low sample size context
- Basic properties of strong mixing conditions. A survey and some open questions
- Multivariate Theory for Analyzing High Dimensional Data
- Convergence and prediction of principal component scores in high-dimensional settings
- Geometric Representation of High Dimension, Low Sample Size Data
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- The high-dimension, low-sample-size geometric representation holds under mild conditions
- On Strong Mixing Conditions for Stationary Gaussian Processes
- PCA Consistency for Non-Gaussian Data in High Dimension, Low Sample Size Context
- Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix
- Intrinsic dimensionality estimation of high-dimension, low sample size data with \(D\)-asymptotics
- Comparison of Discrimination Methods for High Dimensional Data
- Minimum distance classification rules for high dimensional data
Cited In (47)
- Estimation of linear functional of large spectral density matrix and application to Whittle's approach
- A test of sphericity for high-dimensional data and its application for detection of divergently spiked noise
- Statistical inference for high-dimension, low-sample-size data
- Title not available
- Two-Stage Procedures for High-Dimensional Data
- Limiting laws for divergent spiked eigenvalues and largest nonspiked eigenvalue of sample covariance matrices
- A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data
- PCA consistency for the power spiked model in high-dimensional settings
- A High-Dimensional Two-Sample Test for Non-Gaussian Data under a Strongly Spiked Eigenvalue Model
- Reconstruction of a high-dimensional low-rank matrix
- A survey of high dimension low sample size asymptotics
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models
- Asymptotic properties of the first principal component and equality tests of covariance matrices in high-dimension, low-sample-size context
- Polynomial whitening for high-dimensional data
- Using visual statistical inference to better understand random class separations in high dimension, low sample size data
- Location-invariant tests of homogeneity of large-dimensional covariance matrices
- Statistical inference under the strongly spiked eigenvalue model
- Asymptotics of hierarchical clustering for growing dimension
- Binary discrimination methods for high-dimensional data with a geometric representation
- Overview of object oriented data analysis
- Correlation tests for high-dimensional data using extended cross-data-matrix methodology
- Discussion on “Two-Stage Procedures for High-Dimensional Data” by Makoto Aoshima and Kazuyoshi Yata
- Robust PCA for high-dimensional data based on characteristic transformation
- Perturbation theory for cross data matrix-based PCA
- Equality tests of high-dimensional covariance matrices under the strongly spiked eigenvalue model
- Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix
- Inference on high-dimensional mean vectors with fewer observations than the dimension
- Authors' Response
- Analysis of high-dimensional one group repeated measures designs
- Clustering by principal component analysis with Gaussian kernel in high-dimension, low-sample-size settings
- Geometric classifiers for high-dimensional noisy data
- Correlation matrix of equi-correlated normal population: fluctuation of the largest eigenvalue, scaling of the bulk eigenvalues, and stock market
- On asymptotic normality of cross data matrix-based PCA in high dimension low sample size
- Inference on high-dimensional mean vectors under the strongly spiked eigenvalue model
- Hypothesis tests for high-dimensional covariance structures
- Double data piling leads to perfect classification
- Intrinsic dimensionality estimation of high-dimension, low sample size data with \(D\)-asymptotics
- Semiparametric estimation of the high-dimensional elliptical distribution
- Equality tests of covariance matrices under a low-dimensional factor structure
- Test for high-dimensional outliers with principal component analysis
- Consistency of the objective general index in high-dimensional settings
- More about asymptotic properties of some binary classification methods for high dimensional data
- High-dimensional hypothesis testing for allometric extension model
- A classifier under the strongly spiked eigenvalue model in high-dimension, low-sample-size context
- Asymptotic independence of spiked eigenvalues and linear spectral statistics for large sample covariance matrices
- Title not available