Optimal learning rates for least squares regularized regression with unbounded sampling
DOI: 10.1016/j.jco.2010.10.002
zbMath: 1217.65024
OpenAlex: W2125875378
MaRDI QID: Q617656
Publication date: 21 January 2011
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2010.10.002
Keywords: learning algorithms; Gaussian noise; covering number; least squares regression; regularization in reproducing kernel Hilbert spaces
MSC classification:
- Linear regression; mixed models (62J05)
- Learning and adaptive systems in artificial intelligence (68T05)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
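The keywords describe Tikhonov-regularized least squares regression in a reproducing kernel Hilbert space, with "unbounded sampling" referring to outputs that need not be bounded (e.g. Gaussian noise). As a minimal illustrative sketch of that estimator, not of the paper's analysis, the following solves kernel ridge regression via the representer theorem; the Gaussian kernel, bandwidth `sigma`, regularization parameter `lam`, and the toy data are all assumptions for the demo.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2)).
    # Kernel choice is illustrative; any Mercer kernel defines an RKHS.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def rls_fit(X, y, lam=0.1, sigma=1.0):
    # Regularized least squares in the RKHS H_K:
    #   f_z = argmin_{f in H_K} (1/m) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    # By the representer theorem, f_z(x) = sum_i alpha_i K(x, x_i)
    # with alpha solving (K + m * lam * I) alpha = y.
    m = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def rls_predict(X_train, alpha, X_new, sigma=1.0):
    # Evaluate f_z at new points via the kernel expansion.
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy usage: regression with unbounded (Gaussian) output noise,
# echoing the sampling setting named in the title.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + rng.normal(scale=0.3, size=200)
alpha = rls_fit(X, y, lam=1e-3, sigma=0.5)
y_hat = rls_predict(X, alpha, X, sigma=0.5)
```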
Cites Work
- Model selection for regularized least-squares algorithm in learning theory
- Regularization in kernel learning
- Multi-kernel regularized classifiers
- Derivative reproducing properties for kernel methods in learning theory
- Optimal rates for the regularized least-squares algorithm
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Probability Inequalities for the Sum of Independent Random Variables
- SVM learning and Lp approximation by Gaussians on Riemannian manifolds
- Support Vector Machines
- Capacity of reproducing kernel spaces in learning theory
- A new concentration result for regularized risk minimizers
- Online learning with Markov sampling
- Estimating the approximation error in learning theory
- Leave-One-Out Bounds for Kernel Methods
- Learning Theory