Fast rate of convergence in high-dimensional linear discriminant analysis
From MaRDI portal
Publication:3021183
Abstract: This paper gives a theoretical analysis of high-dimensional linear discrimination of Gaussian data. We study the excess risk of linear discriminant rules. We emphasize the poor performance of standard procedures when the dimension p is larger than the sample size n; the corresponding theoretical results are non-asymptotic lower bounds. On the other hand, we propose two discrimination procedures based on dimensionality reduction and provide associated rates of convergence, which can be O(log(p)/n) under sparsity assumptions. Finally, all our results rely on a theorem that provides simple, sharp relations between the excess risk and an estimation error associated with the geometric parameters defining the discrimination rule used.
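The abstract's phenomenon — a plug-in discriminant rule degrading when p > n, and a dimensionality-reduction rule recovering fast rates under sparsity — can be illustrated with a minimal, hypothetical sketch. This is not the paper's exact procedure: the identity covariance, the nearest-centroid rule, and the heuristic threshold sqrt(2 log(p)/n) below are simplifying assumptions made for illustration only.

```python
import numpy as np

# Hypothetical sketch (not the paper's procedures): plug-in nearest-centroid
# discrimination versus hard-thresholding dimensionality reduction when p > n.

rng = np.random.default_rng(0)
p, n = 200, 50                       # dimension larger than sample size
mu = np.zeros(p)
mu[:5] = 2.0                         # sparse mean difference: 5 active coordinates

def sample(n_samples):
    """Draw Gaussian two-class data with identity covariance (assumed known)."""
    y = rng.integers(0, 2, n_samples)
    X = rng.standard_normal((n_samples, p)) + np.outer(y, mu)
    return X, y

Xtr, ytr = sample(n)
Xte, yte = sample(2000)

# Plug-in rule: estimate class centroids, classify by the nearer one.
m0 = Xtr[ytr == 0].mean(axis=0)
m1 = Xtr[ytr == 1].mean(axis=0)

def classify(X, c0, c1):
    return (np.linalg.norm(X - c1, axis=1)
            < np.linalg.norm(X - c0, axis=1)).astype(int)

err_full = np.mean(classify(Xte, m0, m1) != yte)

# Dimensionality reduction: keep only coordinates whose estimated mean
# difference exceeds a noise-level threshold (heuristic, for illustration).
diff = m1 - m0
thresh = 2.0 * np.sqrt(2.0 * np.log(p) / n)
keep = np.abs(diff) > thresh
s0, s1 = np.where(keep, m0, 0.0), np.where(keep, m1, 0.0)
err_sparse = np.mean(classify(Xte * keep, s0, s1) != yte)

print(f"full-dimension error: {err_full:.3f}")
print(f"thresholded error:    {err_sparse:.3f}")
```

In this synthetic setting, the noise accumulated across the p − 5 inactive coordinates of the estimated centroids inflates the error of the full-dimensional rule, while thresholding concentrates the rule on the few informative coordinates.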
Recommendations
- The Dantzig discriminant analysis with high dimensional data
- Discriminant analysis of high-dimensional data over limited samples
- High Dimensional Linear Discriminant Analysis: Optimality, Adaptive Algorithm and Missing Data
- L1 least squares for sparse high-dimensional LDA
- Discriminant analysis in small and large dimensions
Cites work
- Adapting to unknown sparsity by controlling the false discovery rate
- Adaptive covariance estimation of locally stationary processes
- Adaptive estimation of a quadratic functional by model selection
- Class prediction by nearest shrunken centroids, with applications to DNA microarrays
- Fast learning rates for plug-in classifiers
- Feature selection by higher criticism thresholding achieves the optimal phase diagram
- High-dimensional classification using features annealed independence rules
- Ideal spatial adaptation by wavelet shrinkage
- Innovated higher criticism for detecting sparse signals in correlated noise
- Minimax estimation via wavelet shrinkage
- Minimax nonparametric classification. I: Rates of convergence
- Minimax risk over \(l_p\)-balls for \(l_q\)-error
- Modern statistical estimation via oracle inequalities
- Optimal aggregation of classifiers in statistical learning
- Regularized estimation of large covariance matrices
- Risk bounds for statistical learning
- Semigroup stationary processes and spectral representation
- Smooth discrimination analysis
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Stochastic Deconvolution Over Groups
Cited in (4)
- Dynamic linear discriminant analysis in high dimensional space
- Consistency and convergence rate for nearest subspace classifier
- A theoretical contribution to the fast implementation of null linear discriminant analysis with random matrix multiplication
- On general matrix exponential discriminant analysis methods for high dimensionality reduction
This page was built for publication: Fast rate of convergence in high-dimensional linear discriminant analysis