A new perspective on boosting in linear regression via subgradient optimization and relatives (Q682283)
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | A new perspective on boosting in linear regression via subgradient optimization and relatives | scientific article | |
Statements
A new perspective on boosting in linear regression via subgradient optimization and relatives (English)
14 February 2018
Boosting methods in statistical estimation, e.g. in linear regression and classification, build linear combinations of simple estimators in a stage-wise fashion and have been interpreted as gradient-descent-type algorithms in certain function spaces. Here, boosting methods are represented as subgradient descent procedures minimizing a certain maximum absolute correlation function. Moreover, some modifications of boosting methods and their convergence properties are provided. Numerical examples are given.
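To make the connection concrete, the following is a minimal sketch (in Python, assuming NumPy) of incremental forward stagewise regression, one of the boosting relatives in this line of work: at each step the predictor most correlated with the current residual is selected and its coefficient is nudged by a small step, which can be read as a subgradient step on the maximum absolute correlation between the residual and the predictors. The function name `forward_stagewise`, the step size `eps`, and the step count `n_steps` are illustrative choices, not the paper's notation.

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, n_steps=500):
    """Incremental forward stagewise regression (a boosting relative).

    At each iteration, pick the column of X whose absolute correlation
    with the current residual is largest, and move the corresponding
    coefficient by +/- eps.  In the reading sketched in the abstract
    above, this is a subgradient step for minimizing the maximum
    absolute correlation r -> max_j |x_j^T r| over residuals r.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y.astype(float).copy()             # current residual y - X @ beta
    for _ in range(n_steps):
        corr = X.T @ r                      # correlations with the residual
        j = int(np.argmax(np.abs(corr)))    # most correlated predictor
        step = eps * np.sign(corr[j])
        beta[j] += step                     # small stage-wise update
        r -= step * X[:, j]                 # keep residual in sync
    return beta

# Toy usage: recover a sparse linear model from noisy data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + 0.1 * rng.standard_normal(200)
print(forward_stagewise(X, y))
```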
linear regression
boosting
convex programming