Hyper-parameter selection for sparse LS-SVM via minimization of its localized generalization error
DOI: 10.1142/S0219691313500306 · zbMATH Open: 1270.68253 · OpenAlex: W2152721295 · MaRDI QID: Q2846508
Authors: Binbin Sun, Wing W. Y. Ng, Daniel S. Yeung, Patrick P. K. Chan
Publication date: 5 September 2013
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691313500306
Recommendations
- Advances in Neural Networks – ISNN 2005
- Selection methods for extended least squares support vector machines
- A selection method for hyper-parameters of support vector regression by chaotic cultural algorithm
- Practical selection of SVM parameters and noise estimation for SVM regression
- Efficient optimization of hyper-parameters for least squares support vector regression
Keywords: sparsity; least squares support vector machine (LS-SVM); sensitivity measure; hyper-parameter selection; localized generalization error model (L-GEM)
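To illustrate the kind of model the keywords refer to, the sketch below trains an LS-SVM classifier by solving its standard dual linear system and grid-searches the two usual hyper-parameters (regularization gamma and RBF width sigma). The paper selects these by minimizing the localized generalization error bound (L-GEM); that bound is not reproduced in this record, so plain validation error stands in for it here, and all data and grid values are illustrative assumptions.

```python
# Minimal LS-SVM sketch (NumPy only). Hyper-parameter selection here uses
# validation error as a stand-in for the paper's L-GEM criterion.
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # Solve the LS-SVM dual system  [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y].
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvm_predict(X, Xtr, b, alpha, sigma):
    # Decision function sign(K(x, x_i) @ alpha + b).
    return np.sign(rbf_kernel(X, Xtr, sigma) @ alpha + b)

# Toy selection loop: pick (gamma, sigma) with the lowest validation error.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 2))
y = np.sign(X[:, 0] + X[:, 1])
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

def val_error(gamma, sigma):
    b, alpha = lssvm_fit(Xtr, ytr, gamma, sigma)
    return np.mean(lssvm_predict(Xva, Xtr, b, alpha, sigma) != yva)

best = min(((g, s) for g in (0.1, 1.0, 10.0) for s in (0.5, 1.0, 2.0)),
           key=lambda p: val_error(*p))
```

Replacing `val_error` with an evaluation of the L-GEM sensitivity-based bound recovers the selection scheme the title describes; sparsity in the paper's sense (pruning support vectors) is likewise omitted here for brevity.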
Cites Work
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers.
- Benchmarking least squares support vector machine classifiers
- Evolution strategies based adaptive \(L_{p}\) LS-SVM
- Efficient cross-validation for kernelized least-squares regression with sparse basis expansions
- Sparse kernel learning with LASSO and Bayesian inference algorithm
- Feature selection using localized generalization error for supervised classification problems using RBFNN
Cited In (3)