Posterior concentration for Bayesian regression trees and forests (Q2215727)

From MaRDI portal
scientific article

    Statements

    14 December 2020
    The authors consider the classical nonparametric regression model \(Y_i=f_0(\mathbf{x}_i)+\epsilon_i\), where \(\mathbf{x}_i=(x_{i1},\dots, x_{ip})'\), \(1\leq i\leq n\), are vectors of \(p\) potential covariates, \(Y_i\) are the responses, and \(\epsilon_i\) are the noise variables. The statistical problem is to recover \(f_0\) from the samples \((\mathbf{x}_i, Y_i)\). Among nonparametric prediction methods, the paper focuses on Bayesian regression trees and forests. As the authors state, the goal of the paper is to provide optimality results for Bayesian regression trees. To this end, they introduce a new variant of the Bayesian CART prior for dimension reduction and model-free variable selection, the spike-and-tree prior, and establish a set of theoretical results. The second section presents the basics, and the third defines recursive partitions; notions such as valid partitions, balanced partitions, tree partitions, \(k\)-d tree partitions, and tree-structured step functions are explained. The fourth section introduces the spike-and-tree prior and shows that the posterior distribution under the Bayesian CART prior has optimal properties. The fifth section gives a detailed analysis of the collective behavior of the partitioning cells generated by individual trees. The sixth section presents a posterior concentration result for Bayesian additive regression trees in the case where \(f_0\) has an additive structure; implementation considerations and a short discussion of the results follow. The eighth section and the supplementary material, Supplement to ``Posterior concentration for Bayesian regression trees and forests'', \url{doi:10.1214/19-AOS1879SUPP}, contain detailed proofs of the results.
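    The objects in the review can be made concrete with a small sketch. The following is a hypothetical illustration (not the authors' code; all names are made up for this example): it defines a tree-structured step function \(f_0\) that is constant on the cells of an axis-aligned recursive partition of \([0,1]^2\), simulates data from the model \(Y_i=f_0(\mathbf{x}_i)+\epsilon_i\), and recovers the value of \(f_0\) on one cell by averaging responses within it.

```python
import numpy as np

# Hypothetical sketch: a tree-structured step function f0 on [0,1]^2,
# constant on the cells of a depth-2 axis-aligned recursive partition
# (split on x1 at 0.5, then split the left cell on x2 at 0.5).
def f0(x):
    if x[0] < 0.5:
        return 1.0 if x[1] < 0.5 else 2.0
    return 3.0

rng = np.random.default_rng(0)
n, p = 200, 2
X = rng.uniform(size=(n, p))              # covariates x_i in [0,1]^p
eps = rng.normal(scale=0.1, size=n)       # noise variables eps_i
Y = np.array([f0(x) for x in X]) + eps    # responses Y_i = f0(x_i) + eps_i

# Averaging Y over one cell of the partition estimates f0 on that cell:
right_cell = Y[X[:, 0] >= 0.5]
print(right_cell.mean())  # close to 3.0, the value of f0 on that cell
```

    A Bayesian CART or BART procedure would instead place a prior over such partitions and cell values and average over the posterior; the sketch only illustrates the data-generating model and the kind of step functions the partitions induce.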
    additive regression
    asymptotic minimaxity
    BART
    Bayesian CART
    posterior concentration
    recursive partitioning
    regression trees
