Accelerated gradient boosting


DOI: 10.1007/S10994-019-05787-1
zbMATH Open: 1493.68293
arXiv: 1803.02042
OpenAlex: W2794238513
Wikidata: Q128448954
Scholia: Q128448954
MaRDI QID: Q2425242
FDO: Q2425242

Gérard Biau, Benoît Cadre, Laurent Rouvière

Publication date: 26 June 2019

Published in: Machine Learning

Abstract: Gradient tree boosting is a prediction algorithm that sequentially produces a model in the form of linear combinations of decision trees, by solving an infinite-dimensional optimization problem. We combine gradient boosting and Nesterov's accelerated descent to design a new algorithm, which we call AGB (for Accelerated Gradient Boosting). Substantial numerical evidence is provided on both synthetic and real-life data sets to assess the excellent performance of the method in a large variety of prediction problems. It is empirically shown that AGB is much less sensitive to the shrinkage parameter and outputs predictors that are considerably more sparse in the number of trees, while retaining the exceptional performance of gradient boosting.
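The abstract describes AGB only at a high level. As a concrete illustration, here is a minimal sketch of the idea for least-squares regression, assuming scikit-learn regression trees as the weak learners; the class name AGBRegressor and all parameter names are illustrative, not the authors' reference implementation. Each round fits a tree to the pseudo-residuals of an auxiliary "momentum" model G_t, takes the usual shrunken boosting step to obtain F_{t+1}, and then forms G_{t+1} as a Nesterov combination of F_{t+1} and F_t.

    # Sketch of Nesterov-accelerated gradient boosting (AGB) for squared
    # loss, under the assumptions stated above.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    class AGBRegressor:
        def __init__(self, n_iter=100, shrinkage=0.1, max_depth=3):
            self.n_iter = n_iter        # number of boosting rounds T
            self.shrinkage = shrinkage  # shrinkage (step size) nu
            self.max_depth = max_depth  # depth of each regression tree

        def fit(self, X, y):
            X, y = np.asarray(X), np.asarray(y, dtype=float)
            n = len(y)
            # F_0 = G_0 = best constant predictor for squared loss.
            self.intercept_ = y.mean()
            self.trees_ = []
            # Coefficients of F_t and G_t in the span of the fitted trees;
            # both models share the intercept, which the momentum step
            # below leaves unchanged.
            f_coef = np.zeros(self.n_iter)
            g_coef = np.zeros(self.n_iter)
            f_pred = np.full(n, self.intercept_)  # F_t on the training set
            g_pred = np.full(n, self.intercept_)  # G_t on the training set
            lam = 1.0  # lambda_1 = 1, from lambda_0 = 0 in the recursion
            for t in range(self.n_iter):
                lam_next = (1.0 + np.sqrt(1.0 + 4.0 * lam**2)) / 2.0
                gamma = (1.0 - lam) / lam_next  # gamma_t <= 0 for t >= 1
                # Fit a tree to the pseudo-residuals (negative gradient of
                # the squared loss) evaluated at the momentum model G_t.
                tree = DecisionTreeRegressor(max_depth=self.max_depth)
                tree.fit(X, y - g_pred)
                self.trees_.append(tree)
                h = tree.predict(X)
                # Gradient step: F_{t+1} = G_t + nu * h_t.
                f_coef_next = g_coef.copy()
                f_coef_next[t] += self.shrinkage
                f_pred_next = g_pred + self.shrinkage * h
                # Momentum step: G_{t+1} = (1 - gamma_t) F_{t+1} + gamma_t F_t.
                g_coef = (1.0 - gamma) * f_coef_next + gamma * f_coef
                g_pred = (1.0 - gamma) * f_pred_next + gamma * f_pred
                f_coef, f_pred = f_coef_next, f_pred_next
                lam = lam_next
            self.coef_ = f_coef  # the final predictor is F_T
            return self

        def predict(self, X):
            pred = np.full(np.asarray(X).shape[0], self.intercept_)
            for c, tree in zip(self.coef_, self.trees_):
                if c != 0.0:
                    pred = pred + c * tree.predict(X)
            return pred

Note that gamma_t <= 0 for t >= 1, so the momentum step extrapolates beyond F_{t+1} rather than interpolating; this look-ahead is what distinguishes the accelerated scheme from plain gradient boosting with shrinkage.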


Full work available at URL: https://arxiv.org/abs/1803.02042





Cited in: 6 documents
