Faster SVM Training via Conjugate SMO
Publication: 6337032
arXiv: 2003.08719
MaRDI QID: Q6337032
FDO: Q6337032
Authors: Alberto Torres-Barrán, Carlos Alaíz, José R. Dorronsoro
Publication date: 19 March 2020
Abstract: We propose an improved version of the SMO algorithm for training classification and regression SVMs, based on a conjugate descent procedure. This new approach involves only a modest increase in the computational cost of each iteration but, in turn, usually yields a substantial decrease in the number of iterations required to converge to a given precision. Moreover, we prove convergence of the iterates of this new Conjugate SMO, as well as a linear convergence rate when the kernel matrix is positive definite. We have implemented Conjugate SMO within the LIBSVM library and show experimentally that it is faster for many hyper-parameter configurations, often being a better option than second-order SMO when performing a grid search for SVM tuning.
Has companion code repository: https://github.com/niccolot/SciKit_plusplus
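For orientation, below is a minimal sketch of the classic two-variable SMO update that the paper accelerates. It is not the authors' Conjugate SMO nor the LIBSVM implementation: the working pair is chosen naively, the kernel is fixed to linear, and the function name smo_train and its parameter defaults are illustrative assumptions. The conjugate-direction correction proposed in the paper (mixing the previous update direction into the new one via a conjugacy coefficient with respect to the kernel matrix) is only indicated in the comments.

```python
# Minimal sketch of simplified two-variable SMO for the soft-margin SVM dual:
#   max_a  sum_i a_i - 0.5 * sum_ij a_i a_j y_i y_j K(x_i, x_j)
#   s.t.   0 <= a_i <= C,  sum_i a_i y_i = 0
# NOT the paper's Conjugate SMO; see the comments marking where the
# conjugate-direction correction would enter.
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-3, max_passes=10):
    """X: (n, d) data, y: (n,) labels in {-1, +1}. Returns (alpha, b)."""
    n = X.shape[0]
    K = X @ X.T                      # Gram matrix (linear kernel, for brevity)
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # Prediction error on example i under the current model.
            E_i = (alpha * y) @ K[:, i] + b - y[i]
            # KKT violation check against the box constraints 0 <= alpha <= C.
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                E_j = (alpha * y) @ K[:, j] + b - y[j]
                a_i_old, a_j_old = alpha[i], alpha[j]
                # Feasible segment for alpha[j] on the line sum_k alpha_k y_k = const.
                if y[i] != y[j]:
                    L, H = max(0, a_j_old - a_i_old), min(C, C + a_j_old - a_i_old)
                else:
                    L, H = max(0, a_i_old + a_j_old - C), min(C, a_i_old + a_j_old)
                if L == H:
                    continue
                # Curvature of the objective along the SMO direction e_i - e_j.
                # Per the abstract, Conjugate SMO would instead move along a
                # combination of this direction and the previous update
                # direction, chosen to be conjugate with respect to the
                # kernel matrix, at a modest extra cost per iteration.
                eta = K[i, i] + K[j, j] - 2 * K[i, j]
                if eta <= 0:
                    continue
                # Exact line search along the pair direction, then clip.
                alpha[j] = np.clip(a_j_old + y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j_old) < 1e-12:
                    continue
                alpha[i] = a_i_old + y[i] * y[j] * (a_j_old - alpha[j])
                # Recompute the bias from the two updated multipliers.
                b1 = b - E_i - y[i] * (alpha[i] - a_i_old) * K[i, i] \
                     - y[j] * (alpha[j] - a_j_old) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i_old) * K[i, j] \
                     - y[j] * (alpha[j] - a_j_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b
```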