A Note on Support Vector Machines with Polynomial Kernels
From MaRDI portal
Publication: 5380381
DOI: 10.1162/NECO_a_00794
zbMath: 1472.62005
OpenAlex: W2177503208
Wikidata: Q50547922
Scholia: Q50547922
MaRDI QID: Q5380381
Publication date: 4 June 2019
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00794
Mathematics Subject Classification:
- Computational methods for problems pertaining to statistics (62-08)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (2)
Cites Work
- Multi-kernel regularized classifiers
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Fast rates for support vector machines using Gaussian kernels
- Support vector machines are universally consistent
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Support-vector networks
- Regularization networks and support vector machines
- Approximation with polynomial kernels and SVM classifiers
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Learning Theory
- Support Vector Machines
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Theory of Reproducing Kernels