Bayesian additive regression trees with model trees

From MaRDI portal
Publication:2058722

DOI: 10.1007/S11222-021-09997-3
zbMATH Open: 1475.62055
arXiv: 2006.07493
OpenAlex: W3135504614
MaRDI QID: Q2058722
FDO: Q2058722

Estevão B. Prado, Andrew C. Parnell, Rafael A. Moral

Publication date: 9 December 2021

Published in: Statistics and Computing

Abstract: Bayesian Additive Regression Trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART assumes regularisation priors on a set of trees that work as weak learners and is very flexible for predicting in the presence of non-linearity and high-order interactions. In this paper, we introduce an extension of BART, called Model Trees BART (MOTR-BART), that considers piecewise linear functions at node levels instead of piecewise constants. In MOTR-BART, rather than having a unique value at node level for the prediction, a linear predictor is estimated considering the covariates that have been used as the split variables in the corresponding tree. In our approach, local linearities are captured more efficiently and fewer trees are required to achieve equal or better performance than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code for MOTR-BART implementation is available at https://github.com/ebprado/MOTR-BART.
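The abstract's core idea, replacing a tree's constant leaf values with leaf-level linear predictors built from the variables used to split that tree, can be illustrated outside the full Bayesian machinery. The authors' reference implementation is the R code linked above; the following is only an illustrative Python/NumPy sketch (not the authors' code) contrasting a constant-leaf fit with a linear-leaf fit on a single split, where the split point `x < 0.5` and the toy data are assumptions for the demonstration.

```python
import numpy as np

# Toy piecewise-linear data: two regimes joined at x = 0.5 (assumed for illustration)
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 1, n)
y = np.where(x < 0.5, 1.0 + 2.0 * x, -1.0 + 3.0 * x) + rng.normal(0, 0.1, n)

def leaf_constant(x_leaf, y_leaf):
    # BART-style leaf: a single constant (the leaf mean) for every observation
    return np.full_like(y_leaf, y_leaf.mean())

def leaf_linear(x_leaf, y_leaf):
    # MOTR-BART-style leaf: least-squares fit y ~ 1 + x using the split variable
    X = np.column_stack([np.ones_like(x_leaf), x_leaf])
    beta, *_ = np.linalg.lstsq(X, y_leaf, rcond=None)
    return X @ beta

def sse(leaf_fn):
    # Sum of squared errors over the two leaves induced by the split x < 0.5
    total = 0.0
    for mask in (x < 0.5, x >= 0.5):
        resid = y[mask] - leaf_fn(x[mask], y[mask])
        total += float(np.sum(resid ** 2))
    return total

sse_const = sse(leaf_constant)
sse_lin = sse(leaf_linear)
print(sse_lin < sse_const)  # linear leaves capture the local trend the constant misses
```

With a linear trend inside each leaf, the constant-leaf tree would need many more splits (or trees) to approximate the slope, which mirrors the abstract's claim that MOTR-BART needs fewer trees for equal or better performance.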


Full work available at URL: https://arxiv.org/abs/2006.07493





Cited In (5)


This page was built for publication: Bayesian additive regression trees with model trees
