The learning rates of regularized regression based on reproducing kernel Banach spaces
DOI: 10.1155/2013/694181 · zbMATH Open: 1470.68174 · OpenAlex: W2124978107 · Wikidata: Q58916646 · Scholia: Q58916646 · MaRDI QID: Q2318985 · FDO: Q2318985
Authors: Bao-Huai Sheng, Pei-Xin Ye
Publication date: 16 August 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2013/694181
Recommendations
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Learning rates of least-square regularized regression
- Reproducing kernel Banach spaces for machine learning
- Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
- Optimal rate of the regularized regression learning algorithm
MSC classification:
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Cites Work
- Theory of Reproducing Kernels
- Learning Theory
- On the mathematical foundations of learning
- Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces
- Inequalities in Banach spaces with applications
- Reproducing kernel Banach spaces for machine learning
- The covering number in learning theory
- Covering Numbers for Convex Functions
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Sampling and reconstruction of signals in a reproducing kernel subspace of \(L^p(\mathbb R^d)\)
- Approximation with polynomial kernels and SVM classifiers
- The covering number for some Mercer kernel Hilbert spaces
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Approximation by multivariate Bernstein-Durrmeyer operators and learning rates of least-squares regularized regression with multivariate polynomial kernels
- Uniform convergence of Bernstein-Durrmeyer operators with respect to arbitrary measure
- Multivariate Bernstein-Durrmeyer operators with arbitrary weight functions
- Minimization of Tikhonov functionals in Banach spaces
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Learning Theory
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Marcinkiewicz-Zygmund measures on manifolds
- Generalized semi-inner products with applications to regularized learning
- Minimization of the Tikhonov functional in Banach spaces smooth and convex of power type by steepest descent in the dual
- Reproducing kernel Hilbert spaces associated with analytic translation-invariant Mercer kernels
- The covering number for some Mercer kernel Hilbert spaces on the unit sphere
- Aggregation of SVM classifiers using Sobolev spaces
- Modulus of continuity conditions for Jacobi series
Cited In (10)
- Convergence rate of SVM for kernel-based robust regression
- Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
- Error analysis on Hermite learning with gradient data
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Solutions of nonlinear systems by reproducing kernel method
- Kernel regression estimation in a Banach space
- On the \(K\)-functional in learning theory
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Regularized learning schemes in feature Banach spaces
- Learning rates of \(l^q\) coefficient regularization learning with Gaussian kernel