Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers.
DOI: 10.1016/S0031-3203(03)00136-5 · zbMath: 1059.68101 · OpenAlex: W2171428093 · MaRDI QID: Q1426206
Nicola L. C. Talbot, Gavin C. Cawley
Publication date: 14 March 2004
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/s0031-3203(03)00136-5
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Pattern recognition, speech recognition (68T10)
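The technique named in the title rests on a standard closed-form identity: for least-squares kernel machines, all leave-one-out residuals can be obtained from a single fit via the diagonal of the hat matrix, avoiding n retrainings. A minimal sketch of that identity for kernel ridge regression (illustrative only; the function names and the RBF kernel are this sketch's own choices, not the paper's exact KFD formulation):

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_residuals_closed_form(K, y, lam):
    # Closed-form LOO residuals for kernel ridge regression:
    # e_i^{(-i)} = (y_i - f(x_i)) / (1 - H_ii), with hat matrix
    # H = K (K + lam I)^{-1}, so one fit yields all n residuals.
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    f = H @ y
    return (y - f) / (1.0 - np.diag(H))

def loo_residuals_brute_force(K, y, lam):
    # Reference implementation: retrain n times, leaving out one point each.
    n = len(y)
    e = np.empty(n)
    for i in range(n):
        m = np.arange(n) != i
        alpha = np.linalg.solve(K[np.ix_(m, m)] + lam * np.eye(n - 1), y[m])
        e[i] = y[i] - K[i, m] @ alpha
    return e

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
K = rbf_kernel(X, X)
e1 = loo_residuals_closed_form(K, y, lam=0.1)
e2 = loo_residuals_brute_force(K, y, lam=0.1)
print(np.allclose(e1, e2))  # prints True: the shortcut is exact
```

The closed form costs one matrix inversion (O(n^3)) instead of n separate fits (O(n^4)); this is the kind of saving the paper develops for kernel Fisher discriminant classifiers.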
Related Items
- Modified characteristics projection finite element method for time-dependent conduction-convection problems
- Optimally regularised kernel Fisher discriminant classification
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
- Pseudo-inverse linear discriminants for the improvement of overall classification accuracies
- Advances in credit scoring: combining performance and interpretation in kernel discriminant analysis
- Integrated Fisher linear discriminants: an empirical study
- The leave-worst-\(k\)-out criterion for cross validation
- On the Choice of the Kernel Function in Kernel Discriminant Analysis Using Information Complexity
- A reformative kernel Fisher discriminant analysis
- Additive regularization trade-off: fusion of training and validation levels in kernel methods
- Sparse multinomial kernel discriminant analysis (sMKDA)
- A fast kernel-based nonlinear discriminant analysis for multi-class problems
- Efficient approximate leave-one-out cross-validation for kernel logistic regression
- Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation
- HYPER-PARAMETER SELECTION FOR SPARSE LS-SVM VIA MINIMIZATION OF ITS LOCALIZED GENERALIZATION ERROR
- Kernel learning at the first level of inference
- Hierarchical clustered multiclass discriminant analysis via cross-validation
Uses Software
Cites Work
- Support-vector networks
- Some results on Tchebycheffian spline functions and stochastic processes
- Updating the Inverse of a Matrix
- The Relationship between Variable Selection and Data Augmentation and a Method for Prediction
- Adjustment of an Inverse Matrix Corresponding to a Change in One Element of a Given Matrix
- Theory of Reproducing Kernels
- An Inverse Matrix Adjustment Arising in Discriminant Analysis
- Soft margins for AdaBoost
- Choosing multiple parameters for support vector machines