Dynamic trees for learning and design
From MaRDI portal
Publication:5256404
Abstract: Dynamic regression trees are an attractive option for automatic regression and classification with complicated response surfaces in on-line application settings. We create a sequential tree model whose state changes in time with the accumulation of new data, and provide particle learning algorithms that allow for the efficient on-line posterior filtering of tree states. A major advantage of tree regression is that it allows for the use of very simple models within each partition. The model also facilitates a natural division of labor in our sequential particle-based inference: tree dynamics are defined through a few potential changes that are local to each newly arrived observation, while global uncertainty is captured by the ensemble of particles. We consider both constant and linear mean functions at the tree leaves, along with multinomial leaves for classification problems, and propose default prior specifications that allow prediction to be integrated over all model parameters conditional on a given tree. Inference is illustrated in some standard nonparametric regression examples, as well as in the setting of sequential experiment design, including both active learning and optimization applications, and in on-line classification. We detail implementation guidelines and problem-specific methodology for each of these motivating applications. Throughout, it is demonstrated that our practical approach provides better results than commonly used methods at a fraction of the cost.
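The abstract's core recipe is sequential Monte Carlo over tree states: as each observation arrives, particles are reweighted by predictive likelihood, resampled, and then updated by a local change near the new point. A minimal toy sketch of that loop is below; it is an illustrative approximation, not the authors' algorithm. The `TreeParticle` class, 1-D splits, fixed Gaussian noise, and uniform stay/grow/prune moves are all simplifying assumptions made here for brevity.

```python
import copy
import math
import random

class TreeParticle:
    """Toy 'tree' state over 1-D inputs: a sorted list of split points
    defining constant-mean leaves (an assumption; the paper also treats
    linear and multinomial leaves)."""
    def __init__(self):
        self.splits = []   # sorted split locations
        self.data = []     # observed (x, y) pairs seen so far

    def _leaf_ys(self, x):
        """y-values of observations falling in the same leaf as x."""
        lo = max([s for s in self.splits if s <= x], default=-math.inf)
        hi = min([s for s in self.splits if s > x], default=math.inf)
        return [yy for xx, yy in self.data if lo <= xx < hi]

    def predict(self, x):
        ys = self._leaf_ys(x)
        return sum(ys) / len(ys) if ys else 0.0

    def log_weight(self, x, y, sigma=1.0):
        """Gaussian predictive log-likelihood of the incoming point."""
        mu = self.predict(x)
        return (-0.5 * ((y - mu) / sigma) ** 2
                - math.log(sigma * math.sqrt(2 * math.pi)))

    def local_move(self, x, rng):
        """Change local to the new observation: stay, grow (add a split
        near x), or prune (drop the split nearest x)."""
        move = rng.choice(["stay", "grow", "prune"])
        if move == "grow":
            self.splits = sorted(self.splits + [x + rng.gauss(0.0, 0.1)])
        elif move == "prune" and self.splits:
            self.splits.remove(min(self.splits, key=lambda s: abs(s - x)))

def particle_filter(stream, n_particles=50, seed=0):
    """Reweight -> resample -> propagate, once per arriving observation."""
    rng = random.Random(seed)
    particles = [TreeParticle() for _ in range(n_particles)]
    for x, y in stream:
        # 1. Reweight each particle by how well it predicts the new point.
        logw = [p.log_weight(x, y) for p in particles]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        # 2. Resample particles proportionally to weight.
        particles = [copy.deepcopy(rng.choices(particles, weights=w)[0])
                     for _ in range(n_particles)]
        # 3. Propagate: absorb the point, then attempt a local tree move.
        for p in particles:
            p.data.append((x, y))
            p.local_move(x, rng)
    return particles
```

The ensemble prediction is just the particle-averaged leaf mean, which mirrors how global uncertainty lives in the particle cloud while each individual update stays cheap and local.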
Cited in (22):
- Bayesian nonstationary Gaussian process models via treed process convolutions
- Minimax optimal rates for Mondrian trees and forests
- Heteroscedastic BART via Multiplicative Regression Trees
- Dynamic logistic regression and dynamic model averaging for binary classification
- Bayesian optimal sequential design for nonparametric regression via inhomogeneous evolutionary MCMC
- Deep learning for ranking response surfaces with applications to optimal stopping problems
- Sequential design for optimal stopping problems
- Constrained problem formulations for power optimization of aircraft electro-thermal anti-icing systems
- Sequential design for ranking response surfaces
- Bayesian neural networks for selection of drug sensitive genes
- An efficient surrogate model for emulation and physics extraction of large eddy simulations
- Untitled scientific article (zbMATH DE number 2062488)
- Untitled scientific article (zbMATH DE number 7306891)
- Order-based error for managing ensembles of surrogates in mesh adaptive direct search
- Quantifying uncertainty with ensembles of surrogates for blackbox optimization
- Variable selection for BART: an application to gene regulation
- Particle filters and Bayesian inference in financial econometrics
- Multi-output local Gaussian process regression: applications to uncertainty quantification
- Bayesian sequential experimental design for binary response data with application to electromyographic experiments
- Variable selection and sensitivity analysis using dynamic trees, with an application to computer code performance tuning
- Interpretable classifiers using rules and Bayesian analysis: building a better stroke prediction model
- Towards convergence rate analysis of random forests for classification