High-dimensional classification using features annealed independence rules
Publication: 1000303
DOI: 10.1214/07-AOS504 · zbMath: 1360.62327 · arXiv: math/0701108 · OpenAlex: W2049701820 · Wikidata: Q41826462 · Scholia: Q41826462 · MaRDI QID: Q1000303
Publication date: 6 February 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0701108
Classification and discrimination; cluster analysis (statistical aspects) (62H30) ⋮ Applications of statistics to biology and medical sciences; meta analysis (62P10)
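For orientation, the FAIR procedure this publication introduces can be sketched as follows. This is a minimal NumPy illustration under our own assumptions (the function name `fair_classifier`, its parameters, and the hard choice of `m` retained features are ours, not the authors' code): features are ranked by absolute two-sample t-statistics, the top m are kept, and classification uses the independence rule (diagonal linear discriminant).

```python
import numpy as np

def fair_classifier(X, y, m):
    """Sketch of a features-annealed independence rule (FAIR).

    X : (n, p) data matrix, y : binary labels in {0, 1},
    m : number of features retained after t-statistic screening.
    Returns a prediction function for new data.
    """
    X0, X1 = X[y == 0], X[y == 1]
    n0, n1 = len(X0), len(X1)
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    s0 = X0.var(axis=0, ddof=1)          # per-feature sample variances
    s1 = X1.var(axis=0, ddof=1)

    # Two-sample t-statistic for each feature, then keep the top m.
    t = (mu1 - mu0) / np.sqrt(s0 / n0 + s1 / n1)
    keep = np.argsort(-np.abs(t))[:m]

    # Pooled per-feature variance for the diagonal (independence) rule.
    pooled = ((n0 - 1) * s0 + (n1 - 1) * s1) / (n0 + n1 - 2)

    def predict(Xnew):
        # Independence rule: diagonal LDA on the selected features,
        # i.e. assign the class with the smaller standardized distance.
        d0 = ((Xnew[:, keep] - mu0[keep]) ** 2 / pooled[keep]).sum(axis=1)
        d1 = ((Xnew[:, keep] - mu1[keep]) ** 2 / pooled[keep]).sum(axis=1)
        return (d1 < d0).astype(int)

    return predict
```

Treating the covariance as diagonal is what makes the rule well defined when p far exceeds n, and the t-statistic screening step is what "anneals" away noise features before the rule is applied.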
Related Items (only showing first 100 items)
TPRM: tensor partition regression models with applications in imaging biomarker detection ⋮ On selecting interacting features from high-dimensional data ⋮ On two simple and effective procedures for high dimensional classification of general populations ⋮ A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions ⋮ Stein's method in high dimensional classification and applications ⋮ Refined Cramér-type moderate deviation theorems for general self-normalized sums with applications to dependent random variables and winsorized mean ⋮ Self-semi-supervised clustering for large scale data with massive null group ⋮ A general framework for tensor screening through smoothing ⋮ On sufficient variable screening using log odds ratio filter ⋮ New hard-thresholding rules based on data splitting in high-dimensional imbalanced classification ⋮ Discriminant analysis on high dimensional Gaussian copula model ⋮ Self-normalization: taming a wild population in a heavy-tailed world ⋮ The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data ⋮ Tilting Methods for Assessing the Influence of Components in a Classifier ⋮ High dimensional classifiers in the imbalanced case ⋮ A \(U\)-classifier for high-dimensional data under non-normality ⋮ Network linear discriminant analysis ⋮ Continuum directions for supervised dimension reduction ⋮ Statistical inference for high-dimension, low-sample-size data ⋮ The cumulative Kolmogorov filter for model-free screening in ultrahigh dimensional data ⋮ Model free feature screening with dependent variable in ultrahigh dimensional binary classification ⋮ Sparse subspace linear discriminant analysis ⋮ Impacts of high dimensionality in finite samples ⋮ Model-free feature screening for ultrahigh dimensional classification ⋮ PCA consistency for the power spiked model in high-dimensional settings ⋮ Optimal variable selection in multi-group sparse discriminant analysis ⋮ 
Adjusted Pearson chi-square feature screening for multi-classification with ultrahigh dimensional data ⋮ A modified mean-variance feature-screening procedure for ultrahigh-dimensional discriminant analysis ⋮ Feature selection when there are many influential features ⋮ Graph-based sparse linear discriminant analysis for high-dimensional classification ⋮ A selective overview of feature screening for ultrahigh-dimensional data ⋮ Variable selection in linear mixed effects models ⋮ Variational discriminant analysis with variable selection ⋮ A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification ⋮ Shrinkage-based regularization tests for high-dimensional data with application to gene set analysis ⋮ On eigen-matrix translation method for classification of biological data ⋮ A guided random walk through some high dimensional problems ⋮ Robust composite weighted quantile screening for ultrahigh dimensional discriminant analysis ⋮ A majorization-minimization approach to variable selection using spike and slab priors ⋮ Optimal classification in sparse Gaussian graphic model ⋮ Pattern recognition based on canonical correlations in a high dimension low sample size context ⋮ High dimensional classification for spatially dependent data with application to neuroimaging ⋮ A model selection criterion for discriminant analysis of high-dimensional data with fewer observations ⋮ Facilitating high‐dimensional transparent classification via empirical Bayes variable selection ⋮ Simultaneous variable selection and class fusion with penalized distance criterion based classifiers ⋮ Random matrix theory in statistics: a review ⋮ Rank discriminants for predicting phenotypes from RNA expression ⋮ RANK: Large-Scale Inference With Graphical Nonlinear Knockoffs ⋮ Feature selection by higher criticism thresholding achieves the optimal phase diagram ⋮ Feature filter for estimating central mean subspace and its sparse solution ⋮ 
Penalized Independence Rule for Testing High-Dimensional Hypotheses ⋮ Robust rank screening for ultrahigh dimensional discriminant analysis ⋮ Fast rate of convergence in high-dimensional linear discriminant analysis ⋮ Regularization through variable selection and conditional MLE with application to classification in high dimensions ⋮ On high-dimensional change point problem ⋮ Sure feature screening for high-dimensional dichotomous classification ⋮ High-Dimensional Sparse Additive Hazards Regression ⋮ A method for selecting the relevant dimensions for high-dimensional classification in singular vector spaces ⋮ High-dimensional mean estimation via \(\ell_1\) penalized normal likelihood ⋮ Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models ⋮ Sparse semiparametric discriminant analysis ⋮ Variational nonparametric discriminant analysis ⋮ Classification with many classes: challenges and pluses ⋮ Bayesian feature selection for classification with possibly large number of classes ⋮ Sparse linear discriminant analysis by thresholding for high dimensional data ⋮ A multivariate adaptive stochastic search method for dimensionality reduction in classification ⋮ Feature selection in omics prediction problems using cat scores and false nondiscovery rate control ⋮ Sure independence screening in generalized linear models with NP-dimensionality ⋮ Network exploration via the adaptive LASSO and SCAD penalties ⋮ Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis ⋮ Two-group classification with high-dimensional correlated data: a factor model approach ⋮ Model Selection for Classification with a Large Number of Classes ⋮ A unified approach to model selection and sparse recovery using regularized least squares ⋮ Covariate-Adjusted Tensor Classification in High-Dimensions ⋮ Impossibility of successful classification when useful features are rare and weak ⋮ The fused Kolmogorov filter: a nonparametric model-free screening 
method ⋮ QUADRO: a supervised dimension reduction method via Rayleigh quotient optimization ⋮ Generalized alternating direction method of multipliers: new theoretical insights and applications ⋮ Ensemble quantile classifier ⋮ Optimal feature selection for sparse linear discriminant analysis and its applications in gene expression data ⋮ Feature screening for ultrahigh-dimensional additive logistic models ⋮ Tests for a Multiple-Sample Problem in High Dimensions ⋮ Grouped feature screening for ultra-high dimensional data for the classification model ⋮ High-dimensional linear discriminant analysis using nonparametric methods ⋮ Geometric classifiers for high-dimensional noisy data ⋮ Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data ⋮ Entropy-based model-free feature screening for ultrahigh-dimensional multiclass classification ⋮ Bayesian discriminant analysis using a high dimensional predictor ⋮ Two-Stage Procedures for High-Dimensional Data ⋮ Dimension Estimation Using Random Connection Models ⋮ Penalized Classification using Fisher’s Linear Discriminant ⋮ Independent rule in classification of multivariate binary data ⋮ Unified mean-variance feature screening for ultrahigh-dimensional regression ⋮ Sure Independence Screening for Ultrahigh Dimensional Feature Space ⋮ Asymptotic Equivalence of Regularization Methods in Thresholded Parameter Space ⋮ Independence test for high dimensional data based on regularized canonical correlation coefficients ⋮ CLT for linear spectral statistics of normalized sample covariance matrices with the dimension much larger than the sample size ⋮ Asymptotic properties of high-dimensional random forests ⋮ Innovated interaction screening for high-dimensional nonlinear classification ⋮ Higher criticism for large-scale inference, especially for rare and weak effects
Cites Work
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Relaxed Lasso
- Dimension reduction strategies for analyzing global gene expression data with a response
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Weak convergence and empirical processes. With applications to statistics
- Test of Significance Based on Wavelet Thresholding and Neyman's Truncation
- To How Many Simultaneous Hypothesis Tests Can Normal, Student's t or Bootstrap Calibration Be Applied?
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- Boosting With the L2 Loss
- Moderate deviations for two sample t-statistics
- PLS Dimension Reduction for Classification with Microarray Data
- Prediction by Supervised Principal Components