Optimal regression rates for SVMs using Gaussian kernels
From MaRDI portal
Publication:1951100
DOI: 10.1214/12-EJS760
zbMath: 1337.62073
Wikidata: Q59196380 (Scholia: Q59196380)
MaRDI QID: Q1951100
Publication date: 29 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1357913280
62G08: Nonparametric regression and quantile regression
62G05: Nonparametric estimation
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Multikernel Regression with Sparsity Constraint
- Intrinsic Dimension Adaptive Partitioning for Kernel Methods
- Density-Difference Estimation
- Filtering with State-Observation Examples via Kernel Monte Carlo Filter
- Learning Theory Estimates with Observations from General Stationary Stochastic Processes
- Learning Rates for Classification with Gaussian Kernels
- Optimal learning with Gaussians and correntropy loss
- Quantile regression with \(\ell_1\)-regularization and Gaussian kernels
- Learning rates for kernel-based expectile regression
- An SVM-like approach for expectile regression
- Optimal learning with anisotropic Gaussian SVMs
- The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs
- A closer look at covering number bounds for Gaussian kernels
- Distributed regularized least squares with flexible Gaussian kernels
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Moving quantile regression
- Stable splittings of Hilbert spaces of functions of infinitely many variables
- Fast learning from \(\alpha\)-mixing observations
- Interpretable Dynamic Treatment Regimes
Cites Work
- Estimating conditional quantiles with the help of the pinball loss
- Model selection for regularized least-squares algorithm in learning theory
- Regularization in kernel learning
- Learning and approximation by Gaussians on Riemannian manifolds
- Theory of function spaces
- Fast rates for support vector machines using Gaussian kernels
- Global nonparametric estimation of conditional quantile functions and their derivatives
- A distribution-free theory of nonparametric regression
- Optimal rates for the regularized least-squares algorithm
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Error bounds for learning the kernel
- Interpolation of Besov Spaces
- Quantitative Korovkin Theorems for Positive Linear Operators on \(L_p\)-Spaces
- SMO Algorithm for Least-Squares SVM Formulations
- Efficient semiparametric estimation of a partially linear quantile regression model
- Estimating the approximation error in learning theory
- Function Classes That Approximate the Bayes Risk
- Quantile Regression in Reproducing Kernel Hilbert Spaces
- Theory of Reproducing Kernels