The learning rates of regularized regression based on reproducing kernel Banach spaces
From MaRDI portal
Publication:2318985
Recommendations
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Learning rates of least-square regularized regression
- Reproducing kernel Banach spaces for machine learning
- Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
- Optimal rate of the regularized regression learning algorithm
Cites work
- Scientific article; zbMATH DE number 4177781
- Scientific article; zbMATH DE number 713412
- Scientific article; zbMATH DE number 1113627
- Scientific article; zbMATH DE number 1502618
- Aggregation of SVM classifiers using Sobolev spaces
- Approximation by multivariate Bernstein-Durrmeyer operators and learning rates of least-squares regularized regression with multivariate polynomial kernels
- Approximation with polynomial kernels and SVM classifiers
- Best choices for regularization parameters in learning theory: on the bias-variance problem
- Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces
- Covering Numbers for Convex Functions
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Generalized semi-inner products with applications to regularized learning
- Inequalities in Banach spaces with applications
- Learning Theory
- Learning Theory
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Marcinkiewicz-Zygmund measures on manifolds
- Minimization of Tikhonov functionals in Banach spaces
- Minimization of the Tikhonov functional in Banach spaces smooth and convex of power type by steepest descent in the dual
- Modulus of continuity conditions for Jacobi series
- Multivariate Bernstein-Durrmeyer operators with arbitrary weight functions
- On the mathematical foundations of learning
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Reproducing kernel Banach spaces for machine learning
- Reproducing kernel Hilbert spaces associated with analytic translation-invariant Mercer kernels
- Sampling and reconstruction of signals in a reproducing kernel subspace of \(L^p(\mathbb R^d)\)
- The covering number for some Mercer kernel Hilbert spaces
- The covering number for some Mercer kernel Hilbert spaces on the unit sphere
- The covering number in learning theory
- Theory of Reproducing Kernels
- Uniform convergence of Bernstein-Durrmeyer operators with respect to arbitrary measure
Cited in (10)
- On the \(K\)-functional in learning theory
- Convergence rate of SVM for kernel-based robust regression
- Error analysis on Hermite learning with gradient data
- Learning rates of \(l^q\) coefficient regularization learning with Gaussian kernel
- Kernel regression estimation in a Banach space
- Regularized learning schemes in feature Banach spaces
- Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Solutions of nonlinear systems by reproducing kernel method