Estimates of learning rates of regularized regression via polyline functions
From MaRDI portal
Publication: 3118875
DOI: 10.1002/mma.1550
zbMath: 1234.68336
MaRDI QID: Q3118875
Yongquan Zhang, Joonwhoan Lee, Feilong Cao
Publication date: 5 March 2012
Published in: Mathematical Methods in the Applied Sciences
Full work available at URL: https://doi.org/10.1002/mma.1550
62J02: General nonlinear regression
68T05: Learning and adaptive systems in artificial intelligence
41A46: Approximation by arbitrary nonlinear expressions; widths and entropy
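The paper's subject is the regularized least-squares estimator, whose learning rate (the rate at which its error decays with the sample size) is bounded via covering-number arguments in the cited works. As a minimal illustrative sketch only, the snippet below implements the generic regularized least-squares scheme with a Gaussian kernel; the paper's actual polyline (piecewise-linear) hypothesis functions and its specific rate estimates are not reproduced here, and the kernel choice and parameter values are assumptions for illustration.

```python
import numpy as np

def fit_regularized_ls(X, y, lam, gamma=10.0):
    """Regularized least squares (kernel ridge regression), an
    illustrative stand-in for the scheme studied in the paper:
        f_z = argmin_f (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    With a kernel K the minimizer has the closed form
        f_z(x) = sum_i alpha_i K(x, x_i),
        alpha  = (K + lam * m * I)^{-1} y.
    The Gaussian kernel and gamma are assumed choices, not the paper's."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-gamma * sq)  # Gram matrix K_ij = exp(-gamma * |x_i - x_j|^2)
    m = len(y)
    alpha = np.linalg.solve(K + lam * m * np.eye(m), y)
    return alpha

def predict(alpha, X_train, X_new, gamma=10.0):
    """Evaluate f_z at new points via the kernel expansion."""
    sq = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq) @ alpha
```

Decreasing the regularization parameter `lam` always reduces the training residual; the learning-rate analysis concerns how to balance `lam` against the sample size `m` so that the error relative to the true regression function decays fastest.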
Cites Work
- Learning rates for regularized classifiers using multivariate polynomial kernels
- A note on different covering numbers in learning theory
- The covering number in learning theory
- Optimal rates for the regularized least-squares algorithm
- Approximation with polynomial kernels and SVM classifiers
- Learning rates of least-square regularized regression
- On the mathematical foundations of learning
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- Estimating the approximation error in learning theory
- Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
- Covering numbers for support vector machines