Sure feature screening for high-dimensional dichotomous classification
DOI: 10.1007/s11425-016-0117-3 · zbMATH Open: 1360.62081 · OpenAlex: W2557486581 · MaRDI QID: Q525910 · FDO: Q525910
Publication date: 5 May 2017
Published in: Science China. Mathematics
Full work available at URL: https://doi.org/10.1007/s11425-016-0117-3
Mathematics Subject Classification: Statistical ranking and selection procedures (62F07); Reliability, availability, maintenance, inspection in operations research (90B25)
Cites Work
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Measuring and testing dependence by correlation of distances
- Weak convergence and empirical processes. With applications to statistics
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- High-dimensional classification using features annealed independence rules
- Probability Inequalities for Sums of Bounded Random Variables
- Convergence of stochastic processes
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- The Kolmogorov filter for variable screening in high-dimensional binary classification
- Robust rank correlation based screening
- Nonparametric feature screening
- Model-Free Feature Screening for Ultrahigh Dimensional Discriminant Analysis
- Tables of the Cramér-von Mises distributions
Cited In (3)
- Interaction identification and clique screening for classification with ultra-high dimensional discrete features
- Robust Feature Screening via Distance Correlation for Ultrahigh Dimensional Data With Responses Missing at Random
- Tournament screening cum EBIC for feature selection with high-dimensional feature spaces