Fast rates for support vector machines using Gaussian kernels

From MaRDI portal




Abstract: For binary classification we establish learning rates up to the order of n^{-1} for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are in terms of two assumptions on the considered distributions: Tsybakov's noise assumption to establish a small estimation error, and a new geometric noise condition which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not employ any smoothness assumption.
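The ingredients named in the abstract — binary classification, the hinge loss L(y, f(x)) = max(0, 1 − y f(x)), and a Gaussian RBF kernel — can be illustrated with a small sketch. The code below trains a kernel SVM by kernelized Pegasos-style subgradient descent on the regularized hinge loss; this training method, the kernel-width convention exp(−||x − y||² / (2σ²)), and all parameter values are illustrative assumptions, not the algorithm or parameterization analyzed in the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gaussian RBF kernel matrix; the width convention
    # exp(-||x - y||^2 / (2 sigma^2)) is an assumption for this sketch.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def train_kernel_svm(X, y, sigma=1.0, lam=0.1, epochs=50, seed=0):
    # Kernelized Pegasos-style subgradient descent on the regularized
    # hinge loss max(0, 1 - y * f(x)). Pegasos is an illustrative
    # choice, not the method studied in the paper.
    rng = np.random.default_rng(seed)
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.zeros(n)  # per-point counts of margin violations
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            # decision value f(x_i) = (1/(lam*t)) * sum_j alpha_j y_j k(x_j, x_i)
            f_i = K[i] @ (alpha * y) / (lam * t)
            if y[i] * f_i < 1.0:  # hinge loss active: subgradient step
                alpha[i] += 1.0
    t_final = t

    def decision(X_new):
        return gaussian_kernel(X_new, X, sigma) @ (alpha * y) / (lam * t_final)

    return decision

# Usage on two well-separated synthetic Gaussian blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2.0, 1.0, (20, 2)), rng.normal(-2.0, 1.0, (20, 2))])
y = np.concatenate([np.ones(20), -np.ones(20)])
f = train_kernel_svm(X, y, sigma=1.0, lam=0.1, epochs=50)
train_acc = np.mean(np.sign(f(X)) == y)
```

On such well-separated data the learned decision function should classify the training sample nearly perfectly; the point of the sketch is only to make the hinge loss and Gaussian kernel from the abstract concrete.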



This page was built for publication: Fast rates for support vector machines using Gaussian kernels

MaRDI item: Q995417