Sparse quadratic classification rules via linear dimension reduction
Publication: 6032761
DOI: 10.1016/j.jmva.2018.09.011
zbMath: 1409.62128
arXiv: 1711.04817
OpenAlex: W2769930800
Wikidata: Q92145419
Scholia: Q92145419
MaRDI QID: Q6032761
Irina Gaynanova, Tianying Wang
Publication date: 4 January 2019
Published in: Journal of Multivariate Analysis
Full work available at URL: https://arxiv.org/abs/1711.04817
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Bayesian inference (62F15)
- Convex programming (90C25)
- Empirical decision procedures; empirical Bayes procedures (62C12)
Related Items (3)
- Quadratic discriminant analysis by projection
- Unnamed Item
- Prediction and estimation consistency of sparse multi-class penalized optimal scoring
Cites Work
- A tail inequality for quadratic forms of subgaussian random vectors
- Optimal variable selection in multi-group sparse discriminant analysis
- Generalized Bayes estimators of a normal discriminant function
- Class prediction by nearest shrunken centroids, with applications to DNA microarrays
- Adaptive estimation of a quadratic functional by model selection
- Support union recovery in high-dimensional multivariate regression
- Standardization and the Group Lasso Penalty
- ggplot2
- Optimal Feature Selection in High-Dimensional Discriminant Analysis
- Penalized Classification using Fisher’s Linear Discriminant
- Joint estimation of multiple graphical models
- Divergence measures based on the Shannon entropy
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- A Direct Estimation Approach to Sparse Linear Discriminant Analysis
- Sparse Quadratic Discriminant Analysis For High Dimensional Data
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- A New Reduced-Rank Linear Discriminant Analysis Method and Its Applications
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- The Joint Graphical Lasso for Inverse Covariance Estimation Across Multiple Classes
- Model Selection and Estimation in Regression with Grouped Variables
- On the best finite set of linear observables for discriminating two Gaussian signals
- An Application of Information Theory to Multivariate Analysis
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- A selective review of group selection in high-dimensional models