A study of regularized Gaussian classifier in high-dimension small sample set case based on MDL principle with application to spectrum recognition
Publication: 937427
DOI: 10.1016/j.patcog.2008.02.004
zbMath: 1154.68485
OpenAlex: W2067355741
MaRDI QID: Q937427
Ping Guo, Yunde Jia, Michael R. Lyu
Publication date: 15 August 2008
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/j.patcog.2008.02.004
Keywords: classification; minimum description length; covariance matrix estimation; regularization parameter selection; discriminant analysis method
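The keywords describe regularized covariance estimation for a Gaussian classifier when the dimension exceeds the sample size, with the regularization parameter selected by the MDL principle. The sketch below is only a generic illustration of that setting: it uses a simple shrinkage-toward-identity regularizer and a fixed, hypothetical `lam`, not the paper's actual estimator or its MDL-based parameter selection.

```python
import numpy as np

def regularized_covariance(X, lam):
    """Shrink the sample covariance toward a scaled identity matrix.

    Generic shrinkage regularizer (an assumption, not necessarily the
    exact form in the paper); `lam` in [0, 1] is a placeholder for a
    regularization parameter that the paper selects via MDL.
    """
    n, d = X.shape
    S = np.cov(X, rowvar=False, bias=True)      # sample covariance (d x d), singular when d > n
    target = (np.trace(S) / d) * np.eye(d)      # scaled identity shrinkage target
    return (1.0 - lam) * S + lam * target

def gaussian_discriminant(x, mean, cov):
    """Quadratic discriminant score: log Gaussian density up to a constant (equal priors)."""
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + diff @ np.linalg.solve(cov, diff))

# Toy usage: two classes in a high-dimension, small-sample regime (d > n).
rng = np.random.default_rng(0)
d, n = 50, 20
X0 = rng.normal(0.0, 1.0, size=(n, d))
X1 = rng.normal(0.5, 1.0, size=(n, d))
lam = 0.5                                       # fixed here; a model-selection criterion would choose it
cov0 = regularized_covariance(X0, lam)
cov1 = regularized_covariance(X1, lam)
x = rng.normal(0.5, 1.0, size=d)
label = int(gaussian_discriminant(x, X1.mean(axis=0), cov1)
            > gaussian_discriminant(x, X0.mean(axis=0), cov0))
print("predicted class:", label)
```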
Cites Work
- Modeling by shortest data description
- Principal component analysis.
- Face recognition using direct, weighted linear discriminant analysis and modular subspaces
- Linear dimensionality reduction using relevance weighted LDA
- An efficient kernel discriminant analysis method
- Mixture Densities, Maximum Likelihood and the EM Algorithm
- Exact Minimax Strategies for Predictive Density Estimation, Data Compression, and Model Selection
- Regularized Gaussian Discriminant Analysis Through Eigenvalue Decomposition
- The minimum description length principle in coding and modeling
- Estimating the components of a mixture of normal distributions