Learning by atomic norm regularization with polynomial kernels
Publication: 3451221
DOI: 10.1142/S0219691315500356
zbMath: 1343.68192
MaRDI QID: Q3451221
Publication date: 10 November 2015
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691315500356
MSC Classification
- 62G08: Nonparametric regression and quantile regression
- 62G20: Asymptotic properties of nonparametric inference
- 68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Optimality of the rescaled pure greedy learning algorithms
- Coefficient-based regularized regression with dependent and unbounded sampling
Cites Work
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Learning theory approach to a system identification problem involving atomic norm
- Derivative reproducing properties for kernel methods in learning theory
- Fast rates for support vector machines using Gaussian kernels
- The convex geometry of linear inverse problems
- An approximation theory approach to learning with \(\ell^1\) regularization
- Learning with sample dependent hypothesis spaces
- Approximation with polynomial kernels and SVM classifiers
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Local polynomial reproduction and moving least squares approximation
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- Error estimates for scattered data interpolation on spheres