Optimally regularised kernel Fisher discriminant classification
From MaRDI portal
Publication:2383046
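The topic of this publication, regularised kernel Fisher discriminant (KFD) classification, admits a least-squares formulation, and several of the works listed below concern efficient leave-one-out cross-validation for it. The sketch below is an illustrative numpy reconstruction under common assumptions (RBF kernel, labels in {-1, +1}; function names are invented), not the paper's own implementation:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix -- an assumed, common kernel choice."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_rkfd(K, y, lam):
    """Regularised KFD in its least-squares form:
    alpha = (K + lam*I)^{-1} y, with class labels y in {-1, +1}."""
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def loo_residuals(K, y, lam):
    """Exact leave-one-out residuals without n separate refits:
    r_i = alpha_i / [(K + lam*I)^{-1}]_{ii},
    a consequence of the matrix-inverse update formula."""
    n = K.shape[0]
    C_inv = np.linalg.inv(K + lam * np.eye(n))
    alpha = C_inv @ y
    return alpha / np.diag(C_inv)
```

Minimising the sum of squared leave-one-out residuals over the regularisation coefficient `lam` (and kernel parameters such as `gamma`) is the kind of efficient model selection the leave-one-out papers listed here describe.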
Recommendations
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
- Optimising kernel parameters and regularisation coefficients for non-linear discriminant analysis
- Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation
- Learning the optimal kernel for Fisher discriminant analysis via second order cone programming
- On the Choice of the Kernel Function in Kernel Discriminant Analysis Using Information Complexity
Cites work
- scientific article; zbMATH DE number 5957449 (title unavailable)
- scientific article; zbMATH DE number 4054836 (title unavailable)
- scientific article; zbMATH DE number 1928693 (title unavailable)
- scientific article; zbMATH DE number 2107836 (title unavailable)
- scientific article; zbMATH DE number 1391397 (title unavailable)
- scientific article; zbMATH DE number 961607 (title unavailable)
- DOI 10.1162/15324430260185619 (title unavailable)
- Additive regularization trade-off: fusion of training and validation levels in kernel methods
- Chaos control using least-squares support vector machines
- Choosing multiple parameters for support vector machines
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
- Improved sparse least-squares support vector machines
- On a kernel-based method for pattern recognition, regression, approximation, and operator inversion
- Principal component analysis
- Regularization algorithms for learning that are equivalent to multilayer networks
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Soft margins for AdaBoost
- Support-vector networks
- The elements of statistical learning. Data mining, inference, and prediction
- Updating the Inverse of a Matrix
- Weighted least squares support vector machines: robustness and sparse approximation
Cited in (7)
- Efficient approximate leave-one-out cross-validation for kernel logistic regression
- Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation
- Regularized SVM classification with a new complexity-driven stochastic optimizer
- Optimising kernel parameters and regularisation coefficients for non-linear discriminant analysis
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
- Fisher-regularized support vector machine
- Mining competent case bases for case-based reasoning