Asymptotic normality of support vector machine variants and other regularized kernel methods
From MaRDI portal
Publication:765834
Abstract: In nonparametric classification and regression problems, regularized kernel methods, in particular support vector machines, attract much attention in both theoretical and applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions, it is shown that the difference between the estimator, i.e. the empirical SVM, and the theoretical SVM is asymptotically normal with rate \(\sqrt{n}\). That is, the standardized difference converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter may depend on the data. The proof proceeds by an application of the functional delta-method and by showing that the SVM functional is suitably Hadamard-differentiable.
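The abstract's setting can be illustrated numerically. The sketch below is not the paper's construction: it uses the squared loss (kernel ridge regression) as one smooth-loss instance of a regularized kernel M-estimator, approximates the "theoretical" SVM by a fit on a very large sample, and Monte-Carlo-samples the \(\sqrt{n}\)-scaled deviation of the empirical estimator at a fixed evaluation point. All function names, constants, and the data-generating model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between 1-d point sets A and B
    return np.exp(-gamma * (A[:, None] - B[None, :]) ** 2)

def fit_predict(x, y, lam, t, gamma=1.0):
    # Regularized kernel M-estimator with squared loss (kernel ridge).
    # By the representer theorem, f(.) = sum_i alpha_i k(., x_i) with
    # alpha = (K + n*lam*I)^{-1} y; evaluate the fit at point t.
    n = len(x)
    K = gauss_kernel(x, x, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return (gauss_kernel(np.array([t]), x, gamma) @ alpha).item()

def sample(n):
    # illustrative data-generating model (an assumption, not from the paper)
    x = rng.uniform(-1.0, 1.0, n)
    y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(n)
    return x, y

lam, x0 = 0.05, 0.3
# proxy for the theoretical SVM f_{P,lambda}: fit on a very large sample
x_big, y_big = sample(2000)
f_inf = fit_predict(x_big, y_big, lam, x0)

# Monte Carlo: sqrt(n)-scaled deviations of the empirical SVM at x0
n = 400
devs = np.array([
    np.sqrt(n) * (fit_predict(*sample(n), lam, x0) - f_inf)
    for _ in range(100)
])
print(f"sqrt(n)-scaled deviations: mean={devs.mean():.3f}, sd={devs.std():.3f}")
```

The printed standard deviation stabilizing (rather than growing or shrinking with the number of replications) is the finite-sample analogue of the \(\sqrt{n}\)-rate; the paper's result concerns the full Gaussian process limit in the RKHS, not just one pointwise evaluation.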
Recommendations
- Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
- Consistency and convergence rates of one-class SVMs and related algorithms
- Statistical performance of support vector machines
- Consistency of support vector machines using additive kernels for additive models
- Some properties of regularized kernel methods
Cites Work
- scientific article; zbMATH DE number 5968880
- scientific article; zbMATH DE number 1815397
- scientific article; zbMATH DE number 2131207
- scientific article; zbMATH DE number 3137662
- scientific article; zbMATH DE number 1294696
- scientific article; zbMATH DE number 724261
- scientific article; zbMATH DE number 1972340
- scientific article; zbMATH DE number 1476625
- scientific article; zbMATH DE number 1391397
- scientific article; zbMATH DE number 1420699
- scientific article; zbMATH DE number 5055767
- scientific article; zbMATH DE number 3281211
- scientific article; zbMATH DE number 3327878
- scientific article; zbMATH DE number 3367926
- A Bahadur representation of the linear support vector machine
- A new concentration result for regularized risk minimizers
- An Introduction to Banach Space Theory
- Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
- Asymptotic Statistics
- Bouligand derivatives and robustness of support vector machines for regression
- Capacity of reproducing kernel spaces in learning theory
- Consistency and robustness of kernel-based regression in convex risk minimization
- Fast rates for support vector machines using Gaussian kernels
- Measure and integration theory. Transl. from the German by Robert B. Burckel
- On qualitative robustness of support vector machines
- On robustness properties of convex risk minimization methods for pattern recognition
- Optimal aggregation of classifiers in statistical learning.
- Optimal rates for the regularized least-squares algorithm
- Real Analysis and Probability
- Regularization in kernel learning
- Smooth \(\epsilon \)-insensitive regression by loss symmetrization
- Statistical performance of support vector machines
- Support Vector Machines
- Weak convergence and empirical processes. With applications to statistics
Cited In (8)
- Model selection in kernel ridge regression
- Non-asymptotic Analysis of $\ell_1$-norm Support Vector Machines
- Asymptotic distribution for regression in a symmetric periodic Gaussian kernel Hilbert space
- On asymptotic properties of hyperparameter estimators for kernel-based regularization methods
- Asymptotic linear expansion of regularized M-estimators
- Deterministic error analysis of support vector regression and related regularized kernel methods
- On conditional risk estimation considering model risk
- Testing subspace restrictions in the presence of high dimensional nuisance parameters