BART: Bayesian additive regression trees

From MaRDI portal

DOI: 10.1214/09-AOAS285
zbMATH Open: 1189.62066
arXiv: 0806.3286
OpenAlex: W3099006712
MaRDI QID: Q65651


Authors: Hugh A. Chipman, Edward I. George, Robert E. McCulloch


Publication date: 1 March 2010

Published in: The Annals of Applied Statistics

Abstract: We develop a Bayesian "sum-of-trees" model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm that generates samples from a posterior. Effectively, BART is a nonparametric Bayesian regression approach which uses dimensionally adaptive random basis elements. Motivated by ensemble methods in general, and boosting algorithms in particular, BART is defined by a statistical model: a prior and a likelihood. This approach enables full posterior inference including point and interval estimates of the unknown regression function as well as the marginal effects of potential predictors. By keeping track of predictor inclusion frequencies, BART can also be used for model-free variable selection. BART's many features are illustrated with a bake-off against competing methods on 42 different data sets, with a simulation experiment and on a drug discovery classification problem.
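The abstract describes BART's prediction as a "sum-of-trees": the regression function is approximated by the sum of many shallow trees, each constrained by a regularization prior to be a weak learner. The following is a minimal illustrative sketch of that sum-of-trees form only (the `Stump` class and all numbers are hypothetical; this is not the paper's algorithm, which additionally fits the trees by backfitting MCMC and averages over posterior draws):

```python
# Sketch of the sum-of-trees form: f(x) = sum_{j=1}^{m} g(x; T_j, M_j),
# where each g is a shallow regression tree kept weak by BART's
# regularization prior. Trees and values here are purely illustrative.

class Stump:
    """A depth-1 regression tree (weak learner)."""

    def __init__(self, feature, threshold, left_value, right_value):
        self.feature = feature          # index of the splitting variable
        self.threshold = threshold      # split point
        self.left_value = left_value    # leaf output if x[feature] <= threshold
        self.right_value = right_value  # leaf output otherwise

    def predict(self, x):
        return self.left_value if x[self.feature] <= self.threshold else self.right_value


def sum_of_trees(trees, x):
    """The point prediction is the sum of the m trees' outputs."""
    return sum(t.predict(x) for t in trees)


# Two hypothetical weak learners; each contributes a small amount.
trees = [Stump(0, 0.5, -0.1, 0.2), Stump(1, 1.0, 0.05, -0.05)]
print(sum_of_trees(trees, [0.3, 2.0]))  # contributions: -0.1 and -0.05
```

In BART proper, each MCMC iteration yields one such ensemble, and posterior point and interval estimates come from averaging over these draws.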


Full work available at URL: https://arxiv.org/abs/0806.3286





