Support vector machines regression with \(l^1\)-regularizer
DOI: 10.1016/j.jat.2012.06.005
zbMath: 1284.68313
OpenAlex: W2002164713
MaRDI QID: Q1759352
Di-Rong Chen, Fenghong Yang, Hongzhi Tong
Publication date: 20 November 2012
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2012.06.005
Keywords: reproducing kernel Hilbert spaces; coefficient regularization; learning rate; error decomposition; support vector machines regression
MSC classifications:
Nonparametric regression and quantile regression (62G08)
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Computational learning theory (68Q32)
Learning and adaptive systems in artificial intelligence (68T05)
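The keywords above describe coefficient-based \(l^1\) regularization for kernel regression: learning a function \(f(x) = \sum_j c_j K(x, x_j)\) with an \(l^1\) penalty on the coefficient vector. As an illustrative sketch only (not the algorithm analyzed in the paper), the scheme can be written as a Lasso-type problem over kernel-matrix columns, solved here by coordinate descent with soft thresholding; the kernel width `sigma`, penalty `lam`, and iteration count are hypothetical choices:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian kernel: K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def l1_kernel_regression(X, y, lam=0.1, sigma=1.0, n_iter=300):
    """Coefficient regularization with an l^1 penalty (illustrative sketch):
       minimize (1/m) * ||K c - y||^2 + lam * ||c||_1
       via cyclic coordinate descent with soft thresholding."""
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    c = np.zeros(m)
    col_sq = (K ** 2).sum(axis=0) / m   # per-coordinate curvature ||K_j||^2 / m
    for _ in range(n_iter):
        for j in range(m):
            # Residual with coordinate j removed
            r = y - K @ c + K[:, j] * c[j]
            rho = K[:, j] @ r / m
            # Soft-thresholding update induced by the l^1 penalty
            c[j] = np.sign(rho) * max(abs(rho) - lam / 2, 0.0) / col_sq[j]
    return c, K

# Usage: fit a smooth 1-D target; a small penalty keeps most coefficients active,
# while a larger penalty drives many coefficients exactly to zero (sparsity).
X = np.linspace(-1, 1, 20).reshape(-1, 1)
y = np.sin(3 * X).ravel()
c, K = l1_kernel_regression(X, y, lam=1e-3, sigma=0.5)
```

The \(l^1\) penalty is what distinguishes this coefficient-regularization setting from classical RKHS-norm regularization: the optimization runs over the coefficient vector directly, so the hypothesis space is sample-dependent and the solution is typically sparse.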
Related Items (4)
Cites Work
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Least square regression with indefinite kernels and coefficient regularization
- Model selection for regularized least-squares algorithm in learning theory
- Analysis of support vector machines regression
- Regularization networks and support vector machines
- Learning with sample dependent hypothesis spaces
- Learning rates of least-square regularized regression
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- On the mathematical foundations of learning
- Learning Theory
- Decoding by Linear Programming
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Theory of Reproducing Kernels