Feature selection by higher criticism thresholding achieves the optimal phase diagram
Publication:3559955
Abstract: We consider two-class linear classification in a high-dimensional, low-sample-size setting. Only a small fraction of the features are useful, the useful features are unknown to us, and each useful feature contributes weakly to the classification decision -- this setting was called the rare/weak model (RW model). We select features by thresholding feature \(z\)-scores. The threshold is set by higher criticism (HC). Let \(\pi_i\) denote the \(p\)-value associated with the \(i\)-th \(z\)-score and \(\pi_{(i)}\) denote the \(i\)-th order statistic of the collection of \(p\)-values. The HC threshold (HCT) is the order statistic of the \(z\)-score corresponding to the index \(i\) maximizing \((i/n - \pi_{(i)})/\sqrt{\pi_{(i)}(1-\pi_{(i)})}\). The ideal threshold optimizes the classification error. In our earlier PNAS article ["Higher criticism thresholding: optimal feature selection when useful features are rare and weak", Proc. Natl. Acad. Sci. USA] we showed that HCT was numerically close to the ideal threshold. We formalize an asymptotic framework for studying the RW model, considering a sequence of problems with increasingly many features and relatively fewer observations. We show that along this sequence, the limiting performance of ideal HCT is essentially just as good as the limiting performance of ideal thresholding. Our results describe a two-dimensional phase space, a diagram with coordinates quantifying "rare" and "weak" in the RW model. Phase space can be partitioned into two regions -- one where ideal threshold classification is successful, and one where the features are so weak and so rare that it must fail. Surprisingly, the regions where ideal HCT succeeds and fails form exactly the same partition of the phase diagram. Other threshold methods, such as FDR threshold selection, are successful in a substantially smaller region of the phase space than either HCT or ideal thresholding.
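The HCT procedure described in the abstract can be sketched in a few lines: convert each \(z\)-score to a two-sided \(p\)-value, sort the \(p\)-values, find the index maximizing the HC objective, and threshold \(|z|\) at the matching order statistic. This is a minimal illustration, not the authors' implementation; the restriction of the search to the smallest `alpha0` fraction of \(p\)-values and the small epsilon guarding against \(p = 0\) are assumptions of this sketch, and the \(\sqrt{n}\) prefactor does not change the maximizing index.

```python
import math

def hc_threshold(z_scores, alpha0=0.10):
    """Higher criticism threshold (HCT) for a list of feature z-scores.

    Sketch of the procedure in the abstract. Restricting the search to the
    smallest alpha0-fraction of p-values is an assumption of this sketch
    (common in the HC literature), not taken from the abstract itself.
    """
    n = len(z_scores)
    # Two-sided p-value of each z-score under N(0, 1): p = erfc(|z| / sqrt(2)).
    pvals = sorted(math.erfc(abs(z) / math.sqrt(2)) for z in z_scores)
    best_i, best_hc = 1, -math.inf
    for i, p in enumerate(pvals, start=1):
        if i > max(1, int(alpha0 * n)):
            break
        # HC objective sqrt(n) * (i/n - pi_(i)) / sqrt(pi_(i) (1 - pi_(i)));
        # the tiny epsilon only guards against division by zero when p ~ 0.
        hc = math.sqrt(n) * (i / n - p) / math.sqrt(p * (1 - p) + 1e-12)
        if hc > best_hc:
            best_hc, best_i = hc, i
    # The i-th smallest p-value corresponds to the i-th largest |z|, so the
    # threshold is that order statistic of the absolute z-scores.
    abs_sorted = sorted((abs(z) for z in z_scores), reverse=True)
    return abs_sorted[best_i - 1]

def select_features(z_scores, alpha0=0.10):
    """Indices of features whose |z| reaches the HC threshold."""
    t = hc_threshold(z_scores, alpha0)
    return [j for j, z in enumerate(z_scores) if abs(z) >= t]
```

For example, with ten features of which two have clearly elevated \(z\)-scores, `select_features` keeps exactly those two and discards the noise coordinates.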
Recommendations
- Higher criticism for large-scale inference, especially for rare and weak effects
- Rare and weak effects in large-scale inference: methods and phase diagrams
- Optimal classification in sparse Gaussian graphic model
- On false discovery rate thresholding for classification under sparsity
- Higher criticism for detecting sparse heterogeneous mixtures.
Cites work
- Adapting to unknown sparsity by controlling the false discovery rate
- Asymptotic minimaxity of false discovery rate thresholding for sparse exponential data
- Classification of sparse high-dimensional vectors
- Estimation and confidence sets for sparse normal mixtures
- Goodness-of-fit tests via phi-divergences
- High-dimensional classification using features annealed independence rules
- Higher criticism for detecting sparse heterogeneous mixtures.
- Higher criticism thresholding: Optimal feature selection when useful features are rare and weak
- Impossibility of successful classification when useful features are rare and weak
- Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences
- Properties of higher criticism under strong dependence
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
Cited in (28)
- Goodness of fit tests in terms of local levels with special emphasis on higher criticism tests
- Two-group classification with high-dimensional correlated data: a factor model approach
- Feature selection when there are many influential features
- Goodness-of-fit tests based on sup-functionals of weighted empirical processes
- Detection boundary in sparse regression
- High dimensional classifiers in the imbalanced case
- Optimal detection of heterogeneous and heteroscedastic mixtures
- Classification with many classes: challenges and pluses
- Higher criticism to compare two large frequency tables, with sensitivity to possible rare and weak differences
- Using visual statistical inference to better understand random class separations in high dimension, low sample size data
- Signal detection via Phi-divergences for general mixtures
- Estimating the amount of sparsity in two-point mixture models
- Signal localization: a new approach in signal discovery
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Optimal classification in sparse Gaussian graphic model
- The intermediates take it all: asymptotics of higher criticism statistics and a powerful alternative based on equal local levels
- The impossibility region for detecting sparse mixtures using the higher criticism
- Rare and weak effects in large-scale inference: methods and phase diagrams
- Innovated higher criticism for detecting sparse signals in correlated noise
- Classification of sparse high-dimensional vectors
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- Sparse microwave imaging: principles and applications
- Adaptive threshold-based classification of sparse high-dimensional data
- Higher criticism for discriminating word-frequency tables and authorship attribution
- Higher criticism for large-scale inference, especially for rare and weak effects
- Identifying the support of rectangular signals in Gaussian noise
- Fast rate of convergence in high-dimensional linear discriminant analysis
- Asymptotics of goodness-of-fit tests based on minimum p-value statistics