A U-classifier for high-dimensional data under non-normality
From MaRDI portal
Publication:1661350
DOI: 10.1016/J.JMVA.2018.05.008 · zbMATH Open: 1395.62146 · arXiv: 1608.00088 · OpenAlex: W2962871961 · MaRDI QID: Q1661350 · FDO: Q1661350
Authors: M. Rauf Ahmad; Tatjana Pavlenko
Publication date: 16 August 2018
Published in: Journal of Multivariate Analysis
Abstract: A classifier for two or more samples is proposed when the data are high-dimensional and the underlying distributions may be non-normal. The classifier is constructed as a linear combination of two easily computable and interpretable components, the \(U\)-component and the \(P\)-component. The \(U\)-component is a linear combination of \(U\)-statistics, which are averages of bilinear forms of pairwise distinct vectors from two independent samples. The \(P\)-component is the discriminant score and is a function of the projection of the \(U\)-component on the observation to be classified. Combined, the two components constitute an inherently bias-adjusted classifier valid for high-dimensional data. The simplicity of the classifier makes it convenient to study its properties, including its asymptotic normal limit, and to extend it to the multi-sample case. The classifier is linear, but its linearity does not rest on the assumption of homoscedasticity. Probabilities of misclassification and the asymptotic properties of their empirical versions are discussed in detail. Simulation results show the accuracy of the proposed classifier for sample sizes as small as 5 or 7 and arbitrarily large dimensions. Applications to real data sets are also demonstrated.
Full work available at URL: https://arxiv.org/abs/1608.00088
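The bias-adjustment idea behind the \(U\)-component can be sketched numerically. The following is a minimal illustration, not the authors' exact classifier: it assumes that, for each class, the quantity \(\mu_k'\mu_k\) is estimated unbiasedly by a \(U\)-statistic averaging bilinear forms \(x_i'x_j\) over pairwise distinct observations, which removes the \(\operatorname{tr}(\Sigma_k)/n_k\) bias that a plug-in \(\|\bar{x}_k\|^2\) would carry in high dimensions.

```python
import numpy as np

def pairwise_u(X):
    """U-statistic: average of x_i' x_j over pairwise distinct i != j.
    Unbiased for ||mu||^2, unlike ||x_bar||^2, whose bias tr(Sigma)/n
    grows with the dimension."""
    n = X.shape[0]
    G = X @ X.T                                  # all bilinear forms x_i' x_j
    return (G.sum() - np.trace(G)) / (n * (n - 1))

def u_score(z, X1, X2):
    """Illustrative bias-adjusted linear score (a hypothetical simplification):
    classify z to class 1 if the score is positive."""
    proj = z @ (X1.mean(axis=0) - X2.mean(axis=0))    # projection on z
    adjust = 0.5 * (pairwise_u(X1) - pairwise_u(X2))  # U-statistic bias adjustment
    return proj - adjust
```

Because the adjustment uses only pairwise distinct vectors, no inverse covariance matrix is needed, which is what makes such scores usable when the dimension far exceeds the sample sizes.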
Recommendations
- Geometric classifier for multiclass, high-dimensional data
- A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data
- PCA consistency for the power spiked model in high-dimensional settings
- Minimum distance classification rules for high dimensional data
- Robust centroid based classification with minimum error rates for high dimension, low sample size data
Cites Work
- Penalized classification using Fisher's linear discriminant
- Approximation Theorems of Mathematical Statistics
- Asymptotic Statistics
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- High-dimensional classification using features annealed independence rules
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- A direct estimation approach to sparse linear discriminant analysis
- Theoretical Measures of Relative Performance of Classifiers for High Dimensional Data with Small Sample Sizes
- Asymptotic Optimality of Sparse Linear Discriminant Analysis with Arbitrary Number of Classes
- Sparse linear discriminant analysis by thresholding for high dimensional data
- Sparse Quadratic Discriminant Analysis For High Dimensional Data
- Matrix mathematics. Theory, facts, and formulas
- Tests for High-Dimensional Regression Coefficients With Factorial Designs
- Sparsifying the Fisher Linear Discriminant by Rotation
- A \(U\)-statistic approach for a high-dimensional two-sample mean testing problem under non-normality and Behrens-Fisher setting
- A weak invariance principle for weighted \(U\)-statistics with varying kernels
- Decomposability of high-dimensional diversity measures: quasi-\(U\)-statistics, martingales and nonstandard asymptotics
- Bias-corrected diagonal discriminant rules for high-dimensional classification
- Geometric Classifier for Multiclass, High-Dimensional Data
- Scale adjustments for classifiers in high-dimensional, low sample size settings
- A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data
- A modified linear discriminant analysis for high-dimensional data
- The central limit theorem for degenerate variable \(U\)-statistics under dependence
- Location‐invariant Multi‐sample U‐tests for Covariance Matrices with Large Dimension
- Clustering and classification problems in genetics through U-statistics
- Location-invariant tests of homogeneity of large-dimensional covariance matrices
- Innovated interaction screening for high-dimensional nonlinear classification
Cited In (1)
This page was built for publication: A \(U\)-classifier for high-dimensional data under non-normality