Learning by atomic norm regularization with polynomial kernels
DOI: 10.1142/S0219691315500356
zbMath: 1343.68192
OpenAlex: W2180617948
MaRDI QID: Q3451221
Publication date: 10 November 2015
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691315500356
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (2)
- Optimality of the rescaled pure greedy learning algorithms
- Coefficient-based regularized regression with dependent and unbounded sampling
Cites Work
- Concentration estimates for learning with \(\ell^1\)-regularizer and data dependent hypothesis spaces
- Learning theory approach to a system identification problem involving atomic norm
- Derivative reproducing properties for kernel methods in learning theory
- Fast rates for support vector machines using Gaussian kernels
- The convex geometry of linear inverse problems
- An approximation theory approach to learning with \(\ell^1\) regularization
- Learning with sample dependent hypothesis spaces
- Approximation with polynomial kernels and SVM classifiers
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Local polynomial reproduction and moving least squares approximation
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- Error estimates for scattered data interpolation on spheres