A simpler approach to coefficient regularized support vector machines regression (Q1722337)
scientific article; zbMATH DE number 7021924
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | A simpler approach to coefficient regularized support vector machines regression | scientific article; zbMATH DE number 7021924 | |
Statements
A simpler approach to coefficient regularized support vector machines regression (English)
14 February 2019
Summary: We consider a class of support vector machine regression (SVMR) algorithms associated with \(l^q\) (\(1 \leq q < \infty\)) coefficient-based regularization and a data-dependent hypothesis space. Compared with the previous literature, we provide a simpler convergence analysis for these algorithms. The novelty of our analysis lies in the estimation of the hypothesis error, which is carried out by setting up a stepping stone between the coefficient regularized SVMR and the classical SVMR. An explicit learning rate is then derived under very mild conditions.
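For orientation, the following is a minimal sketch of the kind of scheme the summary refers to, written under common conventions for coefficient-based regularization; the Mercer kernel \(K\), sample \(\mathbf{z} = \{(x_i, y_i)\}_{i=1}^m\), \(\epsilon\)-insensitive loss \(|\cdot|_\epsilon\), and regularization parameter \(\lambda\) are assumed notation, not taken from the item's statements:
\[
f_{\mathbf{z}}(x) = \sum_{i=1}^{m} \alpha_{\mathbf{z},i}\, K(x, x_i), \qquad
\boldsymbol{\alpha}_{\mathbf{z}} = \arg\min_{\boldsymbol{\alpha} \in \mathbb{R}^m} \left\{ \frac{1}{m} \sum_{i=1}^{m} \Big| y_i - \sum_{j=1}^{m} \alpha_j K(x_i, x_j) \Big|_{\epsilon} + \lambda \sum_{j=1}^{m} |\alpha_j|^q \right\},
\]
where \(|t|_\epsilon = \max(|t| - \epsilon, 0)\). The hypothesis space \(\{\sum_{i=1}^m \alpha_i K(\cdot, x_i) : \boldsymbol{\alpha} \in \mathbb{R}^m\}\) is built from the sample itself, which makes it data-dependent; the stepping-stone argument mentioned in the summary compares the minimizer of such a coefficient-regularized functional with that of the classical (RKHS-norm regularized) SVMR in order to bound the hypothesis error.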