Learning Rates for Classification with Gaussian Kernels
DOI: 10.1162/neco_a_00968 · zbMath: 1456.68155 · arXiv: 1702.08701 · OpenAlex: W2594741298 · Wikidata: Q50443978 · MaRDI QID: Q5380881
Jinshan Zeng, Shao-Bo Lin, Xiang Yu Chang
Publication date: 6 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1702.08701
Nonparametric regression and quantile regression (62G08)
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Learning and adaptive systems in artificial intelligence (68T05)
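The publication concerns learning rates for classification with Gaussian kernels. As a purely illustrative aid, and not the paper's algorithm or analysis, the sketch below shows binary classification with a Gaussian (RBF) kernel via regularized least squares; the bandwidth sigma, regularization lam, and the toy data are hypothetical choices.

```python
# Minimal sketch (not the paper's method): Gaussian-kernel classification
# via regularized least squares, with a plug-in classifier sign(f(x)).
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
# Toy data: two Gaussian blobs with labels in {-1, +1}.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])

sigma, lam = 1.0, 1e-2                      # illustrative bandwidth and regularization
K = gaussian_kernel(X, X, sigma)
# Kernel ridge estimator: (K + lam * n * I) alpha = y.
alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)

X_test = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
y_test = np.concatenate([-np.ones(50), np.ones(50)])
y_pred = np.sign(gaussian_kernel(X_test, X, sigma) @ alpha)   # plug-in classifier
print("test accuracy:", (y_pred == y_test).mean())
```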
Related Items (3)
- Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications
- Fully corrective gradient boosting with squared hinge: fast learning rates and early stopping
- Optimal learning with Gaussians and correntropy loss
Cites Work
- Logistic classification with varying Gaussians
- Classification with Gaussians and convex loss. II: Improving error bounds by noise conditions
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Covering numbers of Gaussian reproducing kernel Hilbert spaces
- Sparsity in multiple kernel learning
- Multi-kernel regularized classifiers
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Learning and approximation by Gaussians on Riemannian manifolds
- Fast rates for support vector machines using Gaussian kernels
- The covering number in learning theory
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Learning theory estimates for coefficient-based regularized regression
- Optimal regression rates for SVMs using Gaussian kernels
- Statistical performance of support vector machines
- Support vector machines: theory and applications.
- Approximation with polynomial kernels and SVM classifiers
- 10.1162/153244302760185252
- Online regression with varying Gaussians and non-identical distributions
- Learning Theory
- Support Vector Machines
- Capacity of reproducing kernel spaces in learning theory
- An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels
- Minimax nonparametric classification. I: Rates of convergence
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Asymptotic Behaviors of Support Vector Machines with Gaussian Kernel
- Learning Rates of \(\ell^q\) Coefficient Regularization Learning with Gaussian Kernel
- A Note on Support Vector Machines with Polynomial Kernels
- Convexity, Classification, and Risk Bounds
- Some properties of Gaussian reproducing kernel Hilbert spaces and their implications for function approximation and learning theory