Sparse semiparametric discriminant analysis
Publication: 2256757
DOI: 10.1016/j.jmva.2014.12.009 · zbMATH Open: 1307.62166 · arXiv: 1304.4983 · OpenAlex: W1967806355 · MaRDI QID: Q2256757 · FDO: Q2256757
Publication date: 20 February 2015
Published in: Journal of Multivariate Analysis
Abstract: In recent years, a considerable amount of work has been devoted to generalizing linear discriminant analysis to overcome its shortcomings in high-dimensional classification (Witten & Tibshirani 2011; Cai & Liu 2011; Mai et al. 2012; Fan et al. 2012). In this paper, we develop high-dimensional semiparametric sparse discriminant analysis (HD-SeSDA), which generalizes normal-theory discriminant analysis in two ways: it relaxes the Gaussian assumption, and it handles classification problems of non-polynomial (NP) dimension. If the underlying Bayes rule is sparse, HD-SeSDA estimates the Bayes rule and selects the true features simultaneously with overwhelming probability, as long as the logarithm of the dimension grows more slowly than the cube root of the sample size. Simulated and real examples demonstrate the finite-sample performance of HD-SeSDA. At the core of the theory is a new exponential concentration bound for semiparametric Gaussian copulas, which is of independent interest.
Full work available at URL: https://arxiv.org/abs/1304.4983
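The two-step recipe the abstract describes — Gaussianize each margin with a rank-based (nonparanormal) transform, then fit a sparse linear discriminant — can be sketched as below. This is an illustrative sketch, not the authors' implementation: the sparse-LDA direction is obtained via lasso regression on a class-coded response in the spirit of the "direct approach" cited above (Mai et al. 2012), and all function names, the toy data, and the penalty value are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def gaussianize(X):
    """Rank-based nonparanormal transform: Winsorized empirical CDF -> Phi^{-1}."""
    n = X.shape[0]
    delta = 1.0 / (4.0 * n**0.25 * np.sqrt(np.pi * np.log(n)))  # Winsorization level
    ranks = X.argsort(axis=0).argsort(axis=0) + 1               # column-wise ranks 1..n
    U = np.clip(ranks / (n + 1.0), delta, 1.0 - delta)
    return norm.ppf(U)

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent lasso for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    r = y.astype(float).copy()                                  # residual (beta = 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]                              # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_ss[j]
            r -= X[:, j] * beta[j]
    return beta

# Toy data: two classes, a sparse Bayes rule, margins distorted by exp()
n_per, p = 100, 20
mu = np.zeros(p); mu[:3] = 2.0                                  # signal in 3 features
latent = np.vstack([rng.standard_normal((n_per, p)),
                    rng.standard_normal((n_per, p)) + mu])
X = np.exp(latent)                                              # non-Gaussian margins
y = np.repeat([0, 1], n_per)

# Step 1: Gaussianize margins; Step 2: sparse LDA via class-coded regression
Z = gaussianize(X)
Zc = Z - Z.mean(axis=0)
ytil = np.where(y == 1, 1.0, -1.0)                              # equal class sizes
beta = lasso_cd(Zc, ytil, lam=0.1)
mu0, mu1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
thresh = 0.5 * (mu0 + mu1) @ beta                               # midpoint cutoff
yhat = (Z @ beta > thresh).astype(int)
acc = (yhat == y).mean()
```

Because the transform acts on ranks, any monotone distortion of the margins (here, `exp`) is removed before the discriminant is fit, which is what lets the method drop the Gaussian assumption on the observed data.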
Recommendations
- High dimensional discrimination analysis via a semiparametric model
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- Multiclass sparse discriminant analysis
- The Dantzig discriminant analysis with high dimensional data
- Sparse linear discriminant analysis by thresholding for high dimensional data
Cites Work
- Penalized classification using Fisher's linear discriminant
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Efficient estimation in the bivariate normal copula model: Normal margins are least favourable
- High-dimensional semiparametric Gaussian copula graphical models
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional graphs and variable selection with the Lasso
- The nonparanormal: semiparametric estimation of high dimensional undirected graphs
- Restricted eigenvalue properties for correlated Gaussian designs
- Title not available
- Multivariate Dispersion Models Generated From Gaussian Copula
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- Regularized rank-based estimation of high-dimensional nonparanormal graphical models
- High-dimensional classification using features annealed independence rules
- Adapting to unknown sparsity by controlling the false discovery rate
- Estimation of copula-based semiparametric time series models
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- A direct estimation approach to sparse linear discriminant analysis
- Sparse linear discriminant analysis by thresholding for high dimensional data
- Semiparametric estimation in copula models
- Title not available
- DALASS: variable selection in discriminant analysis via the LASSO
- Classifier technology and the illusion of progress
- Discriminant analysis through a semiparametric model
- Title not available
- A Road to Classification in High Dimensional Space: The Regularized Optimal Affine Discriminant
- Efficient Estimation of Semiparametric Multivariate Copula Models
Cited In (20)
- An extreme-value approach for testing the equality of large U-statistic based correlation matrices
- Information criteria in classification: new divergence-based classifiers
- Title not available
- Coordinatewise Gaussianization: Theories and Applications
- A distribution-free test of independence based on a modified mean variance index
- Sparse Orthogonal Linear Discriminant Analysis
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- Dynamic modeling and online monitoring of tensor data streams with application to passenger flow surveillance
- Multiclass sparse discriminant analysis incorporating graphical structure among predictors
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- Graph-based sparse linear discriminant analysis for high-dimensional classification
- Sparse discriminant analysis based on estimation of posterior probabilities
- Discriminant analysis through a semiparametric model
- Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
- Linear discriminant analysis with sparse and dense signals
- Covariate-Adjusted Tensor Classification in High-Dimensions
- A convex optimization approach to high-dimensional sparse quadratic discriminant analysis
- Sparse linear discriminant analysis in structured covariates space
- Varying coefficient linear discriminant analysis for dynamic data
- Discussion of "Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation"