Asymptotic normality of support vector machine variants and other regularized kernel methods

From MaRDI portal
Publication:765834

DOI: 10.1016/J.JMVA.2011.11.004
zbMATH Open: 1352.62056
arXiv: 1010.0535
OpenAlex: W1964319030
MaRDI QID: Q765834
FDO: Q765834


Author: Robert Hable


Publication date: 22 March 2012

Published in: Journal of Multivariate Analysis

Abstract: In nonparametric classification and regression problems, regularized kernel methods, and in particular support vector machines, attract much attention in theoretical and applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions, it is shown that the difference between the estimator, i.e., the empirical SVM, and the theoretical SVM is asymptotically normal with rate √n. That is, the standardized difference converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter may depend on the data. The proof proceeds by an application of the functional delta-method and by showing that the SVM functional is suitably Hadamard-differentiable.
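For readers skimming this record, a minimal LaTeX sketch of the setting and result described in the abstract may help. The notation below (loss L, reproducing kernel Hilbert space H, sample D_n of size n, distribution P, regularization parameter λ) is assumed for illustration and is not quoted verbatim from the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Empirical SVM: a regularized M-estimator over the reproducing kernel Hilbert space $H$
\[
  \hat{f}_{D_n,\lambda}
    = \operatorname*{arg\,min}_{f \in H}\;
      \frac{1}{n}\sum_{i=1}^{n} L\bigl(x_i, y_i, f(x_i)\bigr) + \lambda \lVert f \rVert_H^2 .
\]
% Theoretical SVM: the same objective with the empirical mean replaced by the expectation under $P$
\[
  f_{P,\lambda}
    = \operatorname*{arg\,min}_{f \in H}\;
      \mathbb{E}_P\bigl[L(X, Y, f(X))\bigr] + \lambda \lVert f \rVert_H^2 .
\]
% Asymptotic normality (informal): for a smooth loss and a possibly data-dependent
% regularization parameter $\lambda_{D_n}$ converging suitably to some $\lambda_0 > 0$,
\[
  \sqrt{n}\,\bigl(\hat{f}_{D_n,\lambda_{D_n}} - f_{P,\lambda_0}\bigr)
    \rightsquigarrow \mathbb{H}
  \quad \text{weakly in } H,
\]
% where $\mathbb{H}$ is a zero-mean Gaussian process in $H$; the proof applies the
% functional delta-method after showing that the map $P \mapsto f_{P,\lambda}$ is
% suitably Hadamard-differentiable.
\end{document}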


Full work available at URL: https://arxiv.org/abs/1010.0535




Cited in: 7 documents





