On two simple and effective procedures for high dimensional classification of general populations
From MaRDI portal
Abstract: In this paper, we generalize two criteria, the determinant-based and trace-based criteria proposed by Saranadasa (1993), to general populations for high dimensional classification. These two criteria compare distances between a new observation and several different known groups. The determinant-based criterion performs well for correlated variables by incorporating the covariance structure and is competitive with many other existing rules. The criterion, however, requires the measurement dimension to be smaller than the sample size. The trace-based criterion, in contrast, is an independence rule and is effective in the "large dimension-small sample size" scenario. An appealing property of these two criteria is that their implementation is straightforward: there is no need for preliminary variable selection or tuning parameters. Their asymptotic misclassification probabilities are derived using the theory of large dimensional random matrices. Their competitive performance is illustrated by intensive Monte Carlo experiments and a real data analysis.
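The distance-comparison idea behind the two criteria can be sketched as follows. This is a minimal illustration only, not the paper's exact rules: the determinant-based sketch uses a QDA-style Mahalanobis distance with a log-determinant term (which, as noted, needs the dimension below the sample size so the covariance estimate is invertible), while the trace-based sketch is an independence rule that ignores correlations and normalizes squared Euclidean distance by the average variance tr(S)/p. The function names and the precise normalizations are assumptions for illustration.

```python
import numpy as np

def det_criterion(x, means, covs):
    """Determinant-based rule (sketch): assign x to the group minimizing a
    Mahalanobis-type distance plus a log-determinant penalty.
    Requires each covariance estimate to be invertible (dimension < sample size).
    The exact criterion in the paper may differ; this shows the general idea."""
    scores = []
    for m, S in zip(means, covs):
        d = x - m
        scores.append(d @ np.linalg.solve(S, d) + np.log(np.linalg.det(S)))
    return int(np.argmin(scores))

def trace_criterion(x, means, covs):
    """Trace-based rule (sketch): an independence rule that compares squared
    Euclidean distances normalized by the average variance tr(S)/p, so no
    matrix inversion is needed even when dimension exceeds sample size."""
    scores = []
    for m, S in zip(means, covs):
        d = x - m
        scores.append((d @ d) / (np.trace(S) / len(d)))
    return int(np.argmin(scores))

# Usage: estimate group means and covariances from training samples,
# then classify a new observation by whichever group scores lowest.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(100, 5))   # group 0 centered at 0
X1 = rng.normal(3.0, 1.0, size=(100, 5))   # group 1 centered at 3
means = [X0.mean(axis=0), X1.mean(axis=0)]
covs = [np.cov(X0.T), np.cov(X1.T)]
```

Neither sketch involves variable selection or a tuning parameter, which mirrors the implementation simplicity the abstract highlights.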
Recommendations
- Non-parametric comparison and classification of two large-scale populations
- High breakdown estimation for multiple populations with applications to discriminant analysis
- Classifying two populations by Bayesian method and applications
- Neyman-Pearson classification under high-dimensional settings
- Stein's method in high dimensional classification and applications
- Characterizing the scale dimension of a high-dimensional classification problem
- Asymptotic expansion of the misclassification probabilities of D- and A-criteria for discrimination from two high dimensional populations using the theory of large dimensional random matrices
- Asymptotic probabilities of misclassification of two discriminant functions in cases of high dimensional data
- Classification and Generalized Principal Component Analysis
Cites work
- scientific article; zbMATH DE number 823069
- scientific article; zbMATH DE number 889593
- A road to classification in high dimensional space: the regularized optimal affine discriminant
- Asymptotic expansion of the misclassification probabilities of D- and A-criteria for discrimination from two high dimensional populations using the theory of large dimensional random matrices
- Asymptotic probabilities of misclassification of two discriminant functions in cases of high dimensional data
- Discriminant analysis of multivariate repeated measures data with Kronecker product structured covariance matrices
- Enhancement of the applicability of Markowitz's portfolio optimization by utilizing random matrix theory
- Error rates in classification consisting of discrete and continuous variables in the presence of covariates
- Fast nonparametric classification based on data depth
- High-dimensional classification using features annealed independence rules
- Regularized linear discriminant analysis and its application in microarrays
- Some tests for the covariance matrix with fewer observations than the dimension under non-normality
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Sparse linear discriminant analysis by thresholding for high dimensional data
- Spectral analysis of large dimensional random matrices
- Tests for high-dimensional covariance matrices
- Two sample tests for high-dimensional covariance matrices
Cited in (5)
- Two-group classification with high-dimensional correlated data: a factor model approach
- Test on the linear combinations of covariance matrices in high-dimensional data
- On the dimension effect of regularized linear discriminant analysis
- High-dimensional linear models: a random matrix perspective
- Asymptotic expansion of the misclassification probabilities of D- and A-criteria for discrimination from two high dimensional populations using the theory of large dimensional random matrices