A closer look at covering number bounds for Gaussian kernels
From MaRDI portal
Publication:1996885
Abstract: We establish some new bounds on the log-covering numbers of (anisotropic) Gaussian reproducing kernel Hilbert spaces. Unlike previous results in this direction, we focus on small explicit constants and their dependence on crucial parameters such as the kernel bandwidth and the size and dimension of the underlying space.
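As background (not part of this record's abstract), the quantity studied can be sketched as follows. For the Gaussian kernel with bandwidth \(\gamma > 0\) on a bounded set \(X \subset \mathbb{R}^d\), bounds in the literature on the log-covering number of the unit ball \(B_{H_\gamma}\) of the associated RKHS, measured in the sup norm, take the shape

```latex
\[
  \ln \mathcal{N}\bigl(\varepsilon,\, B_{H_\gamma},\, \|\cdot\|_\infty\bigr)
  \;\le\; C(d, \gamma, X)\,\bigl(\ln(1/\varepsilon)\bigr)^{d+1},
  \qquad 0 < \varepsilon < 1/2 .
\]
```

Here \(\mathcal{N}(\varepsilon, B_{H_\gamma}, \|\cdot\|_\infty)\) denotes the smallest number of \(\varepsilon\)-balls needed to cover \(B_{H_\gamma}\); the form of the factor \(C(d, \gamma, X)\), written here only schematically, is exactly what the abstract says the paper makes small and explicit.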
Cites work
- scientific article; zbMATH DE number 44592
- scientific article; zbMATH DE number 3215519
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- Capacity of reproducing kernel spaces in learning theory
- Covering numbers of Gaussian reproducing kernel Hilbert spaces
- Estimating the approximation error in learning theory
- Gaussian processes for machine learning
- Learning Theory
- Learning rates for kernel-based expectile regression
- Optimal regression rates for SVMs using Gaussian kernels
- Support Vector Machines
- The covering number in learning theory
Cited in (3)