High-dimensional classification using features annealed independence rules

From MaRDI portal
Publication: 1000303

DOI: 10.1214/07-AOS504 · zbMath: 1360.62327 · arXiv: math/0701108 · OpenAlex: W2049701820 · Wikidata: Q41826462 · Scholia: Q41826462 · MaRDI QID: Q1000303

Yingying Fan, Jianqing Fan

Publication date: 6 February 2009

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/math/0701108
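The paper's FAIR procedure (features annealed independence rules) screens features by two-sample t-statistics and then classifies with the independence rule, i.e. diagonal linear discriminant analysis on the retained features. The sketch below is a minimal illustration of that two-step idea, not the authors' implementation: in particular, the number of retained features `m` is taken as a user-supplied input here, whereas the paper chooses it by minimizing an upper bound on the classification error.

```python
import numpy as np

def t_statistics(X, y):
    """Componentwise two-sample t-statistics for binary labels y in {0, 1}."""
    X0, X1 = X[y == 0], X[y == 1]
    n0, n1 = len(X0), len(X1)
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    v0, v1 = X0.var(axis=0, ddof=1), X1.var(axis=0, ddof=1)
    return (m1 - m0) / np.sqrt(v0 / n0 + v1 / n1)

def fair_classify(X_train, y_train, X_test, m):
    """Independence rule (diagonal LDA) on the m features with the
    largest absolute t-statistics; m is assumed given (the paper
    selects it by minimizing an error bound)."""
    t = t_statistics(X_train, y_train)
    top = np.argsort(-np.abs(t))[:m]          # indices of the m strongest features
    Xs, Xt = X_train[:, top], X_test[:, top]
    mu0 = Xs[y_train == 0].mean(axis=0)
    mu1 = Xs[y_train == 1].mean(axis=0)
    # per-feature variances only: off-diagonal covariances are ignored,
    # which is exactly the "independence rule" working assumption
    s2 = Xs.var(axis=0, ddof=1)
    d0 = ((Xt - mu0) ** 2 / s2).sum(axis=1)   # standardized distance to class 0
    d1 = ((Xt - mu1) ** 2 / s2).sum(axis=1)   # standardized distance to class 1
    return (d1 < d0).astype(int)
```

On simulated data with a handful of mean-shifted features among many noise dimensions, the screening step recovers most of the signal coordinates and the diagonal rule classifies well, illustrating why discarding the weak features helps when the dimension is much larger than the sample size.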



Related Items

TPRM: tensor partition regression models with applications in imaging biomarker detection, On selecting interacting features from high-dimensional data, On two simple and effective procedures for high dimensional classification of general populations, A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions, Stein's method in high dimensional classification and applications, Refined Cramér-type moderate deviation theorems for general self-normalized sums with applications to dependent random variables and winsorized mean, Self-semi-supervised clustering for large scale data with massive null group, A general framework for tensor screening through smoothing, On sufficient variable screening using log odds ratio filter, New hard-thresholding rules based on data splitting in high-dimensional imbalanced classification, Discriminant analysis on high dimensional Gaussian copula model, Self-normalization: taming a wild population in a heavy-tailed world, The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data, Tilting Methods for Assessing the Influence of Components in a Classifier, High dimensional classifiers in the imbalanced case, A \(U\)-classifier for high-dimensional data under non-normality, Network linear discriminant analysis, Continuum directions for supervised dimension reduction, Statistical inference for high-dimension, low-sample-size data, The cumulative Kolmogorov filter for model-free screening in ultrahigh dimensional data, Model free feature screening with dependent variable in ultrahigh dimensional binary classification, Sparse subspace linear discriminant analysis, Impacts of high dimensionality in finite samples, Model-free feature screening for ultrahigh dimensional classification, PCA consistency for the power spiked model in high-dimensional settings, Optimal variable selection in multi-group sparse discriminant analysis, Adjusted Pearson chi-square feature screening for multi-classification with ultrahigh dimensional data, A modified mean-variance feature-screening procedure for ultrahigh-dimensional discriminant analysis, Feature selection when there are many influential features, Graph-based sparse linear discriminant analysis for high-dimensional classification, A selective overview of feature screening for ultrahigh-dimensional data, Variable selection in linear mixed effects models, Variational discriminant analysis with variable selection, A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification, Shrinkage-based regularization tests for high-dimensional data with application to gene set analysis, On eigen-matrix translation method for classification of biological data, A guided random walk through some high dimensional problems, Robust composite weighted quantile screening for ultrahigh dimensional discriminant analysis, A majorization-minimization approach to variable selection using spike and slab priors, Optimal classification in sparse Gaussian graphic model, Pattern recognition based on canonical correlations in a high dimension low sample size context, High dimensional classification for spatially dependent data with application to neuroimaging, A model selection criterion for discriminant analysis of high-dimensional data with fewer observations, Facilitating high‐dimensional transparent classification via empirical Bayes variable selection, Simultaneous variable selection and class fusion with penalized distance criterion based classifiers, Random matrix theory in statistics: a review, Rank discriminants for predicting phenotypes from RNA expression, RANK: Large-Scale Inference With Graphical Nonlinear Knockoffs, Feature selection by higher criticism thresholding achieves the optimal phase diagram, Feature filter for estimating central mean subspace and its sparse solution, Penalized Independence Rule for Testing High-Dimensional Hypotheses, Robust rank screening for ultrahigh dimensional discriminant analysis, Fast rate of convergence in high-dimensional linear discriminant analysis, Regularization through variable selection and conditional MLE with application to classification in high dimensions, On high-dimensional change point problem, Sure feature screening for high-dimensional dichotomous classification, High-Dimensional Sparse Additive Hazards Regression, A method for selecting the relevant dimensions for high-dimensional classification in singular vector spaces, High-dimensional mean estimation via \(\ell_1\) penalized normal likelihood, Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models, Sparse semiparametric discriminant analysis, Variational nonparametric discriminant analysis, Classification with many classes: challenges and pluses, Bayesian feature selection for classification with possibly large number of classes, Sparse linear discriminant analysis by thresholding for high dimensional data, A multivariate adaptive stochastic search method for dimensionality reduction in classification, Feature selection in omics prediction problems using cat scores and false nondiscovery rate control, Sure independence screening in generalized linear models with NP-dimensionality, Network exploration via the adaptive LASSO and SCAD penalties, Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis, Two-group classification with high-dimensional correlated data: a factor model approach, Model Selection for Classification with a Large Number of Classes, A unified approach to model selection and sparse recovery using regularized least squares, Covariate-Adjusted Tensor Classification in High-Dimensions, Impossibility of successful classification when useful features are rare and weak, The fused Kolmogorov filter: a nonparametric model-free screening method, QUADRO: a supervised dimension reduction method via Rayleigh quotient optimization, Generalized alternating direction method of multipliers: new theoretical insights and applications, Ensemble quantile classifier, Optimal feature selection for sparse linear discriminant analysis and its applications in gene expression data, Feature screening for ultrahigh-dimensional additive logistic models, Tests for a Multiple-Sample Problem in High Dimensions, Grouped feature screening for ultra-high dimensional data for the classification model, High-dimensional linear discriminant analysis using nonparametric methods, Geometric classifiers for high-dimensional noisy data, Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data, Entropy-based model-free feature screening for ultrahigh-dimensional multiclass classification, Bayesian discriminant analysis using a high dimensional predictor, Two-Stage Procedures for High-Dimensional Data, Dimension Estimation Using Random Connection Models, Penalized Classification using Fisher’s Linear Discriminant, Independent rule in classification of multivariate binary data, Unified mean-variance feature screening for ultrahigh-dimensional regression, Sure Independence Screening for Ultrahigh Dimensional Feature Space, Asymptotic Equivalence of Regularization Methods in Thresholded Parameter Space, Independence test for high dimensional data based on regularized canonical correlation coefficients, CLT for linear spectral statistics of normalized sample covariance matrices with the dimension much larger than the sample size, Asymptotic properties of high-dimensional random forests, Innovated interaction screening for high-dimensional nonlinear classification, Higher criticism for large-scale inference, especially for rare and weak effects, Covariate Information Number for Feature Screening in Ultrahigh-Dimensional Supervised Problems, The sparse group lasso for high-dimensional integrative linear discriminant analysis with application to Alzheimer's disease prediction, On maximum depth classifiers: depth distribution approach, On Sure Screening with Multiple Responses, Consistent Screening Procedures in High-dimensional Binary Classification, Integrative genetic risk prediction using non‐parametric empirical Bayes classification, Prior Knowledge Guided Ultra-High Dimensional Variable Screening With Application to Neuroimaging Data, An efficient approach for discriminant analysis based on adaptive feature augmentation, Structured sparse support vector machine with ordered features, Variance ratio screening for ultrahigh dimensional discriminant analysis, High dimensional asymptotics for the naive Hotelling T2 statistic in pattern recognition, A p-value based dimensionality reduction test for high dimensional means, Classification by likelihood accordance functions, Nonstationary Gaussian Process Discriminant Analysis With Variable Selection for High-Dimensional Functional Data, Multiway Sparse Distance Weighted Discrimination, A Normality Test for High-dimensional Data Based on the Nearest Neighbor Approach, An improved modified Cholesky decomposition approach for precision matrix estimation, Robust feature screening procedures for single and mixed types of data, Feature Screening for Interval-Valued Response with Application to Study Association between Posted Salary and Required Skills, Optimal discriminant analysis in high-dimensional latent factor models, Beyond Linear Dynamic Functional Connectivity: A Vine Copula Change Point Model, Optimal Nonparametric Inference with Two-Scale Distributional Nearest Neighbors, A dynamic screening algorithm for hierarchical binary marketing data, A Reproducing Kernel Hilbert Space Framework for Functional Classification, Regularized Linear Programming Discriminant Rule with Folded Concave Penalty for Ultrahigh-Dimensional Data, Coordinatewise Gaussianization: Theories and Applications, Multiclass sparse discriminant analysis incorporating graphical structure among predictors, Kernel naive Bayes discrimination for high‐dimensional pattern recognition, A Doubly Enhanced EM Algorithm for Model-Based Tensor Clustering, A Simple Two-Sample Test in High Dimensions Based on L2-Norm, IPAD: Stable Interpretable Forecasting with Knockoffs Inference, Diagonally Dominant Principal Component Analysis, The Dantzig Discriminant Analysis with High Dimensional Data, Classification Error of the Thresholded Independence Rule, The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising, Composite Coefficient of Determination and Its Application in Ultrahigh Dimensional Variable Screening, Thresholding-based outlier detection for high-dimensional data, SURE estimates for high dimensional classification, Weighted linear programming discriminant analysis for high‐dimensional binary classification, Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods, Variable selection of linear programming discriminant estimator, Robust Variable and Interaction Selection for Logistic Regression and General Index Models, On rank distribution classifiers for high-dimensional data, An improved banded estimation for large covariance matrix



Cites Work