Some theory for Fisher's linear discriminant function, 'naive Bayes', and some alternatives when there are many more variables than observations
From MaRDI portal
Publication:1763097
DOI: 10.3150/bj/1106314847 · zbMath: 1064.62073 · OpenAlex: W2118250684 · Wikidata: Q105584438 (Scholia: Q105584438) · MaRDI QID: Q1763097
Elizaveta Levina, Peter J. Bickel
Publication date: 21 February 2005
Published in: Bernoulli
Full work available at URL: https://doi.org/10.3150/bj/1106314847
Mathematics Subject Classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Bayesian inference (62F15) · Inference from stochastic processes and spectral analysis (62M15)
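The record itself carries no mathematical content, but as a rough orientation to the comparison named in the title, the sketch below contrasts the two plug-in rules for two Gaussian classes when there are many more variables than observations: Fisher's linear discriminant, which uses the (pseudo-)inverse of the pooled sample covariance, and the 'naive Bayes' independence rule, which keeps only its diagonal. This is an illustrative NumPy sketch on synthetic data, with function names and parameters chosen here for exposition; it is not code or analysis from the publication.

```python
# Illustrative sketch (not from the publication): plug-in Fisher LDA vs. the
# "naive Bayes" independence rule for two Gaussian classes with p >> n.
import numpy as np

def fit_plugin_rules(X0, X1):
    """Return discriminant directions for Fisher's rule and the independence rule."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = X0.shape[0], X1.shape[0]
    # Pooled sample covariance (singular when p > n0 + n1 - 2).
    S = ((X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)) / (n0 + n1 - 2)
    diff = mu1 - mu0
    w_fisher = np.linalg.pinv(S) @ diff   # Fisher: full covariance via pseudo-inverse
    w_naive = diff / np.diag(S)           # naive Bayes: diagonal of S only
    midpoint = (mu0 + mu1) / 2
    return w_fisher, w_naive, midpoint

def classify(X, w, midpoint):
    """Assign class 1 when the linear score exceeds the midpoint threshold."""
    return ((X - midpoint) @ w > 0).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, n = 500, 25                           # many more variables than observations
    shift = np.zeros(p)
    shift[:10] = 1.0                         # a few informative coordinates
    X0 = rng.normal(size=(n, p))
    X1 = rng.normal(size=(n, p)) + shift
    w_f, w_nb, mid = fit_plugin_rules(X0, X1)
    Xt0 = rng.normal(size=(200, p))
    Xt1 = rng.normal(size=(200, p)) + shift
    err_f = np.mean(np.r_[classify(Xt0, w_f, mid), 1 - classify(Xt1, w_f, mid)])
    err_nb = np.mean(np.r_[classify(Xt0, w_nb, mid), 1 - classify(Xt1, w_nb, mid)])
    print(f"test error  Fisher (pinv): {err_f:.2f}   independence rule: {err_nb:.2f}")
```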
Related Items
- The sparse group lasso for high-dimensional integrative linear discriminant analysis with application to Alzheimer's disease prediction
- A Cholesky-based estimation for large-dimensional covariance matrices
- Integrative genetic risk prediction using non-parametric empirical Bayes classification
- A resample-replace lasso procedure for combining high-dimensional markers with limit of detection
- Tilting Methods for Assessing the Influence of Components in a Classifier
- A DC Programming Approach for Sparse Linear Discriminant Analysis
- Variance ratio screening for ultrahigh dimensional discriminant analysis
- A classifier under the strongly spiked eigenvalue model in high-dimension, low-sample-size context
- Statistical inference for high-dimension, low-sample-size data
- Robust support vector machine for high-dimensional imbalanced data
- Sparse subspace linear discriminant analysis
- High dimensional binary classification under label shift: phase transition and regularization
- An Efficient Convex Formulation for Reduced-Rank Linear Discriminant Analysis in High Dimensions
- An Efficient Greedy Search Algorithm for High-Dimensional Linear Discriminant Analysis
- Resolution of Degeneracy in Merton's Portfolio Problem
- Interpoint Distance Classification of High Dimensional Discrete Observations
- Nonstationary Gaussian Process Discriminant Analysis With Variable Selection for High-Dimensional Functional Data
- Fast Bayesian variable screenings for binary response regressions with small sample size
- A Normality Test for High-dimensional Data Based on the Nearest Neighbor Approach
- An improved modified Cholesky decomposition approach for precision matrix estimation
- Block-diagonal precision matrix regularization for ultra-high dimensional data
- Variable Selection for Model-Based High-Dimensional Clustering and Its Application to Microarray Data
- A DC Programming Approach for Sparse Estimation of a Covariance Matrix
- Higher criticism thresholding: Optimal feature selection when useful features are rare and weak
- Sparse overlapped linear discriminant analysis
- Polynomial whitening for high-dimensional data
- Regularized Linear Programming Discriminant Rule with Folded Concave Penalty for Ultrahigh-Dimensional Data
- A Pairwise Hotelling Method for Testing High-Dimensional Mean Vectors
- Multiclass sparse discriminant analysis incorporating graphical structure among predictors
- Kernel naive Bayes discrimination for high-dimensional pattern recognition
- Discriminant analysis in small and large dimensions
- A Doubly Enhanced EM Algorithm for Model-Based Tensor Clustering
- Facilitating high-dimensional transparent classification via empirical Bayes variable selection
- Learning from a lot: Empirical Bayes for high-dimensional model-based prediction
- Simultaneous variable selection and class fusion for high-dimensional linear discriminant analysis
- Feature selection by higher criticism thresholding achieves the optimal phase diagram
- Penalized Independence Rule for Testing High-Dimensional Hypotheses
- Fast rate of convergence in high-dimensional linear discriminant analysis
- The Dantzig Discriminant Analysis with High Dimensional Data
- Classification Error of the Thresholded Independence Rule
- Pairwise Variable Selection for High-Dimensional Model-Based Clustering
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Bias-Corrected Diagonal Discriminant Rules for High-Dimensional Classification
- Tests for a Multiple-Sample Problem in High Dimensions
- On the limits of clustering in high dimensions via cost functions
- Sparse linear discriminant analysis in structured covariates space
- SURE estimates for high dimensional classification
- Weighted linear programming discriminant analysis for high-dimensional binary classification
- Correlated Component Regression: Re-thinking Regression in the Presence of Near Collinearity
- Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods
- Diagonal Discriminant Analysis With Feature Selection for High-Dimensional Data
- Some Statistical Problems with High Dimensional Financial data
- Batch Effects Correction with Unknown Subtypes
- Dimension Estimation Using Random Connection Models
- Banded Regularization of Autocovariance Matrices in Application to Parameter Estimation and Forecasting of Time Series
- Penalized Classification using Fisher's Linear Discriminant
- A Geometric Perspective on the Power of Principal Component Association Tests in Multiple Phenotype Studies
- Shrinkage-based Diagonal Discriminant Analysis and Its Applications in High-Dimensional Data
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- A simple model-based approach to variable selection in classification and clustering
- On rank distribution classifiers for high-dimensional data
- Tests for mean vectors in high dimension
- Modified linear discriminant analysis using block covariance matrix in high-dimensional data
- High-Dimensional Data Classification
- A convex optimization approach to high-dimensional sparse quadratic discriminant analysis
- Improved second order estimation in the singular multivariate normal model
- High dimensional discrimination analysis via a semiparametric model
- TPRM: tensor partition regression models with applications in imaging biomarker detection
- Discriminant analysis with Gaussian graphical tree models
- A linear programming model for selection of sparse high-dimensional multiperiod portfolios
- On two simple and effective procedures for high dimensional classification of general populations
- A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions
- Stein's method in high dimensional classification and applications
- An adaptive decorrelation procedure for signal detection
- New hard-thresholding rules based on data splitting in high-dimensional imbalanced classification
- Stability of feature selection in classification issues for high-dimensional correlated data
- Moments and root-mean-square error of the Bayesian MMSE estimator of classification error in the Gaussian model
- Discriminant analysis on high dimensional Gaussian copula model
- Simultaneous predictive Gaussian classifiers
- Data mining in electronic commerce
- Persistence of plug-in rule in classification of high dimensional multivariate binary data
- Recipes for sparse LDA of horizontal data
- High dimensional covariance matrix estimation by penalizing the matrix-logarithm transformed likelihood
- General sparse multi-class linear discriminant analysis
- High dimensional classifiers in the imbalanced case
- A \(U\)-classifier for high-dimensional data under non-normality
- Network linear discriminant analysis
- Continuum directions for supervised dimension reduction
- Sparse HDLSS discrimination with constrained data piling
- Classification accuracy as a proxy for two-sample testing
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Overlap in observational studies with high-dimensional covariates
- Optimal functional supervised classification with separation condition
- Regularization in statistics
- PCA consistency for the power spiked model in high-dimensional settings
- Optimal variable selection in multi-group sparse discriminant analysis
- Feature selection when there are many influential features
- Graph-based sparse linear discriminant analysis for high-dimensional classification
- A selective overview of feature screening for ultrahigh-dimensional data
- On the border of extreme and mild spiked models in the HDLSS framework
- Distribution-free high-dimensional two-sample tests based on discriminating hyperplanes
- Variable selection in linear mixed effects models
- Variational discriminant analysis with variable selection
- A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification
- Penalized model-based clustering
- Shrinkage-based diagonal Hotelling's tests for high-dimensional small sample size data
- Shrinkage-based regularization tests for high-dimensional data with application to gene set analysis
- Linear classifiers are nearly optimal when hidden variables have diverse effects
- A guided random walk through some high dimensional problems
- Sparse permutation invariant covariance estimation
- High dimensional sparse covariance estimation via directed acyclic graphs
- Optimal classification in sparse Gaussian graphic model
- Pattern recognition based on canonical correlations in a high dimension low sample size context
- High dimensional classification for spatially dependent data with application to neuroimaging
- Covariance estimation: the GLM and regularization perspectives
- Covariance and precision matrix estimation for high-dimensional time series
- Beyond HC: more sensitive tests for rare/weak alternatives
- An extension of Fisher's discriminant analysis for stochastic processes
- Simultaneous variable selection and class fusion with penalized distance criterion based classifiers
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
- Random matrix theory in statistics: a review
- Variable selection in discriminant analysis based on the location model for mixed variables
- Bayesian variable selection for high dimensional generalized linear models: convergence rates of the fitted densities
- A multiple testing approach to the regularisation of large sample correlation matrices
- Canonical correlation for stochastic processes
- Asymptotic results in canonical discriminant analysis when the dimension is large compared to the sample size
- The delta method for analytic functions of random operators with application to functional data
- High-dimensional asymptotics of prediction: ridge regression and classification
- Robust rank screening for ultrahigh dimensional discriminant analysis
- Regularization through variable selection and conditional MLE with application to classification in high dimensions
- Regularized estimation of large covariance matrices
- A method for selecting the relevant dimensions for high-dimensional classification in singular vector spaces
- A robust unified approach to analyzing methylation and gene expression data
- Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models
- Sparse estimation of large covariance matrices via a nested Lasso penalty
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Regularized principal components of heritability
- Variational nonparametric discriminant analysis
- Classification with many classes: challenges and pluses
- Sparse linear discriminant analysis by thresholding for high dimensional data
- Sparse Bayesian hierarchical modeling of high-dimensional clustering problems
- Feature selection in omics prediction problems using cat scores and false nondiscovery rate control
- Alternating direction method of multipliers for penalized zero-variance discriminant analysis
- Two-group classification with high-dimensional correlated data: a factor model approach
- On the dimension effect of regularized linear discriminant analysis
- Covariate-Adjusted Tensor Classification in High-Dimensions
- Covariance regularization by thresholding
- High-dimensional classification using features annealed independence rules
- Spectrum estimation for large dimensional covariance matrices using random matrix theory
- High dimensional covariance matrix estimation using multi-factor models from incomplete information
- Generalized alternating direction method of multipliers: new theoretical insights and applications
- Ensemble quantile classifier
- Optimal feature selection for sparse linear discriminant analysis and its applications in gene expression data
- Conditional probability estimation based classification with class label missing at random
- A Penalized Likelihood Method for Classification With Matrix-Valued Predictors
- Dynamic linear discriminant analysis in high dimensional space
- A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data
- High-dimensional linear models: a random matrix perspective
- Geometric classifiers for high-dimensional noisy data
- Classification and clustering of sequencing data using a Poisson model
- Large-sample properties of unsupervised estimation of the linear discriminant using projection pursuit
- Independent rule in classification of multivariate binary data
- Varying coefficient linear discriminant analysis for dynamic data
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
Cites Work
- On the optimality of the simple Bayesian classifier under zero-one loss
- On positive spectral density functions
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Behavioral and prescriptive explanations of a reverse sunk cost effect
- Lectures on operator theory and its applications, Fields Institute for Research in Mathematical Sciences, Toronto, Canada 1994
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data