Sparse semiparametric discriminant analysis
From MaRDI portal
Publication:2256757
Abstract: In recent years, a considerable amount of work has been devoted to generalizing linear discriminant analysis to overcome its inadequacy for high-dimensional classification (Witten & Tibshirani, 2011; Cai & Liu, 2011; Mai et al., 2012; Fan et al., 2012). In this paper, we develop high-dimensional semiparametric sparse discriminant analysis (HD-SeSDA), which generalizes normal-theory discriminant analysis in two ways: it relaxes the Gaussian assumptions and it handles classification problems of non-polynomial (NP) dimension. If the underlying Bayes rule is sparse, HD-SeSDA can estimate the Bayes rule and select the true features simultaneously with overwhelming probability, as long as the logarithm of the dimension grows more slowly than the cube root of the sample size. Simulated and real examples are used to demonstrate the finite-sample performance of HD-SeSDA. At the core of the theory is a new exponential concentration bound for semiparametric Gaussian copulas, which is of independent interest.
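The two-step idea behind the method can be sketched in code: first estimate the unknown monotone marginal transformations of the Gaussian copula model with a rank-based (Winsorized normal-score) transform, then fit a sparse discriminant direction on the transformed data. The sketch below is illustrative only and is not the paper's actual estimator; in particular, `sparse_lda_direction` uses simple soft-thresholding of a diagonal-covariance LDA direction as a hypothetical stand-in for the penalized estimator, and the truncation level `delta` follows a choice common in the nonparanormal literature.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copula_transform(X):
    """Winsorized normal-score transform of each column.

    Maps every feature through Phi^{-1}(truncated empirical CDF), a
    rank-based estimate of the unknown monotone transformations in a
    semiparametric Gaussian copula model.
    """
    n = X.shape[0]
    # Truncation level: a common choice in the nonparanormal
    # literature (an assumption, not taken from this paper).
    delta = 1.0 / (4.0 * n**0.25 * np.sqrt(np.pi * np.log(n)))
    U = np.apply_along_axis(lambda c: rankdata(c) / n, 0, X)
    U = np.clip(U, delta, 1.0 - delta)
    return norm.ppf(U)

def sparse_lda_direction(Z, y, lam):
    """Soft-thresholded diagonal-LDA direction on transformed data.

    A crude, hypothetical stand-in for the paper's penalized
    estimator: features whose (standardized) mean difference falls
    below lam are set to zero, yielding a sparse discriminant vector.
    """
    mu_diff = Z[y == 1].mean(axis=0) - Z[y == 0].mean(axis=0)
    var = Z.var(axis=0, ddof=1)
    beta = mu_diff / var
    return np.sign(beta) * np.maximum(np.abs(beta) - lam, 0.0)
```

Because the transform depends on the data only through ranks, it is invariant to any monotone distortion of the marginals, which is what allows the Gaussian assumption to be relaxed.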
Recommendations
- High dimensional discrimination analysis via a semiparametric model
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- Multiclass sparse discriminant analysis
- The Dantzig discriminant analysis with high dimensional data
- Sparse linear discriminant analysis by thresholding for high dimensional data
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 784362
- scientific article; zbMATH DE number 845707
- scientific article; zbMATH DE number 845714
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- A direct estimation approach to sparse linear discriminant analysis
- A road to classification in high dimensional space: the regularized optimal affine discriminant
- A unified approach to model selection and sparse recovery using regularized least squares
- Adapting to unknown sparsity by controlling the false discovery rate
- Classifier technology and the illusion of progress
- DALASS: variable selection in discriminant analysis via the LASSO
- Discriminant analysis through a semiparametric model
- Efficient Estimation of Semiparametric Multivariate Copula Models
- Efficient estimation in the bivariate normal copula model: Normal margins are least favourable
- Estimation of copula-based semiparametric time series models
- High-dimensional classification using features annealed independence rules
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional semiparametric Gaussian copula graphical models
- Least angle regression. (With discussion)
- Multivariate Dispersion Models Generated From Gaussian Copula
- Nearly unbiased variable selection under minimax concave penalty
- On the conditions used to prove oracle results for the Lasso
- Penalized classification using Fisher's linear discriminant
- Regularization and Variable Selection Via the Elastic Net
- Regularized rank-based estimation of high-dimensional nonparanormal graphical models
- Restricted eigenvalue properties for correlated Gaussian designs
- Semiparametric estimation in copula models
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse linear discriminant analysis by thresholding for high dimensional data
- The Adaptive Lasso and Its Oracle Properties
- The nonparanormal: semiparametric estimation of high dimensional undirected graphs
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (22 documents)
- Discussion of "Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation"
- High dimensional discrimination analysis via a semiparametric model
- An extreme-value approach for testing the equality of large U-statistic based correlation matrices
- Covariate-adjusted tensor classification in high dimensions
- Information criteria in classification: new divergence-based classifiers
- CODA: high dimensional copula discriminant analysis
- Coordinatewise Gaussianization: Theories and Applications
- A distribution-free test of independence based on a modified mean variance index
- Sparse Orthogonal Linear Discriminant Analysis
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- Dynamic modeling and online monitoring of tensor data streams with application to passenger flow surveillance
- Multiclass sparse discriminant analysis incorporating graphical structure among predictors
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- A direct approach for sparse quadratic discriminant analysis
- Graph-based sparse linear discriminant analysis for high-dimensional classification
- Sparse discriminant analysis based on estimation of posterior probabilities
- Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
- Discriminant analysis through a semiparametric model
- A convex optimization approach to high-dimensional sparse quadratic discriminant analysis
- Linear discriminant analysis with sparse and dense signals
- Sparse linear discriminant analysis in structured covariates space
- Varying coefficient linear discriminant analysis for dynamic data