Support vector machines regression with \(l^1\)-regularizer
DOI: 10.1016/j.jat.2012.06.005 · zbMATH Open: 1284.68313 · OpenAlex: W2002164713 · MaRDI QID: Q1759352 · FDO: Q1759352
Authors: Hongzhi Tong, Di-Rong Chen, Fenghong Yang
Publication date: 20 November 2012
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2012.06.005
Recommendations
- Analysis of support vector machines regression
- A simpler approach to coefficient regularized support vector machines regression
- Least squares regression with \(l_1\)-regularizer in sum space
- Learning with varying insensitive loss
- An approximation theory approach to learning with \(\ell^1\) regularization
Keywords: reproducing kernel Hilbert spaces; coefficient regularization; learning rate; error decomposition; support vector machines regression
MSC classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Computational learning theory (68Q32)
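For orientation, the keywords above point to coefficient-based \(\ell^1\) regularization for support vector machines regression over a data-dependent hypothesis space. A minimal sketch of the general form of such a scheme, assuming a sample \(\mathbf z=\{(x_i,y_i)\}_{i=1}^{m}\), a kernel \(K\), the \(\varepsilon\)-insensitive loss, and a regularization parameter \(\lambda>0\) (notation introduced here for illustration, not taken from this record, and not necessarily the paper's exact formulation), is
\[
f_{\mathbf z}(x)=\sum_{i=1}^{m}\alpha_i^{\mathbf z}\,K(x,x_i),\qquad
\boldsymbol\alpha^{\mathbf z}=\arg\min_{\boldsymbol\alpha\in\mathbb R^{m}}
\left\{\frac{1}{m}\sum_{i=1}^{m}\Big|y_i-\sum_{j=1}^{m}\alpha_j K(x_i,x_j)\Big|_{\varepsilon}
+\lambda\sum_{i=1}^{m}|\alpha_i|\right\},
\]
where \(|t|_{\varepsilon}=\max\{|t|-\varepsilon,\,0\}\) denotes the \(\varepsilon\)-insensitive loss. Learning rates for schemes of this kind are typically obtained via an error decomposition of \(f_{\mathbf z}\) into approximation and sample error terms, in line with the "error decomposition" and "learning rate" keywords.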
Cites Work
- Regularization networks and support vector machines
- Title not available
- Theory of Reproducing Kernels
- Learning Theory
- On the mathematical foundations of learning
- Decoding by Linear Programming
- Support vector machine soft margin classifiers: error analysis
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Learning with sample dependent hypothesis spaces
- Learning rates of least-square regularized regression
- Least square regression with indefinite kernels and coefficient regularization
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Model selection for regularized least-squares algorithm in learning theory
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Analysis of support vector machines regression
Cited In (11)
- Title not available
- The equivalence of support vector machine and regularization neural networks
- A simpler approach to coefficient regularized support vector machines regression
- Regularization networks and support vector machines
- Kernel methods for independence measurement with coefficient constraints
- Analysis of support vector machines regression
- A \(q\)-norm regularizing least-square-support-vector-machine linear classifier algorithm via iterative reweighted conjugate gradient
- Regularized ranking with convex losses and \(\ell^1\)-penalty
- Learning with varying insensitive loss
- Classification with polynomial kernels and \(l^1\)-coefficient regularization
- ℓ1-Norm support vector machine for ranking with exponentially strongly mixing sequence