Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
DOI: 10.1016/J.JAT.2018.07.003 · zbMATH Open: 1398.68437 · OpenAlex: W2887553323 · MaRDI QID: Q1784975
Authors: Qin Guo, Cheng Wang, Pei-Xin Ye
Publication date: 27 September 2018
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2018.07.003
Recommendations
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Least square regression with indefinite kernels and coefficient regularization
- Coefficient-based regression with non-identical unbounded sampling
- Regularization networks with indefinite kernels
Keywords: indefinite kernel; learning rate; coefficient-based regularized regression; moment hypothesis; stepping stone function
MSC: General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Title not available
- Some results on Tchebycheffian spline functions and stochastic processes
- Learning Theory
- On the mathematical foundations of learning
- Title not available
- Learning theory estimates for coefficient-based regularized regression
- Concentration estimates for learning with unbounded sampling
- Learning theory estimates via integral operators and their approximations
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Least square regression with indefinite kernels and coefficient regularization
- Unified approach to coefficient-based regularized regression
- Multiscale kernels
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Learning rates for least square regressions with coefficient regularization
- Constructive analysis for coefficient regularization regression algorithms
- Coefficient-based regularized regression with dependent and unbounded sampling
- Coefficient regularized regression with non-iid sampling
Cited In (11)
- Indefinite kernel network with \(l^q\)-norm regularization
- Coefficient-based regression with non-identical unbounded sampling
- Least square regression with indefinite kernels and coefficient regularization
- Analysis of regression algorithms with unbounded sampling
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Coefficient-based regularized regression with dependent and unbounded sampling
- Coefficient-based regularized distribution regression
- Optimality of the rescaled pure greedy learning algorithms
- Learning with convex loss and indefinite kernels
- Regularization networks with indefinite kernels
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels