Oracle inequalities for support vector machines that are based on random entropy numbers
From MaRDI portal
Publication:731974
DOI: 10.1016/j.jco.2009.06.002
zbMath: 1192.68540
OpenAlex: W2158432264
Wikidata: Q59196398
Scholia: Q59196398
MaRDI QID: Q731974
Publication date: 9 October 2009
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2009.06.002
MSC classifications: Computational learning theory (68Q32); Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Generalization properties of doubly stochastic learning algorithms
- Kernel-based maximum correntropy criterion with gradient descent method
- Nonasymptotic analysis of robust regression with modified Huber's loss
- Estimating conditional quantiles with the help of the pinball loss
- Fast learning from \(\alpha\)-mixing observations
- Calibration of \(\epsilon\)-insensitive loss in support vector machines regression
- Learning rates for kernel-based expectile regression
- Distributed regularized least squares with flexible Gaussian kernels
- Measuring the Capacity of Sets of Functions in the Analysis of ERM
Cites Work
- Multi-kernel regularized classifiers
- Fast rates for support vector machines using Gaussian kernels
- The covering number in learning theory
- Weak convergence and empirical processes. With applications to statistics
- Statistical performance of support vector machines
- The sizes of compact subsets of Hilbert space and continuity of Gaussian processes
- Support Vector Machines
- On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
- A new concentration result for regularized risk minimizers
- Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
- Improving the sample complexity using global data
- Learning Theory
- DOI: 10.1162/1532443041424337
- Learning Theory