Gibbs Priors for Bayesian Nonparametric Variable Selection with Weak Learners
From MaRDI portal
Publication:6180734
Cites work
- Adaptive Conditional Distribution Estimation with Bayesian Decision Tree Ensembles
- An introduction to statistical learning: with applications in R
- Automated versus do-it-yourself methods for causal inference: lessons learned from a data analysis competition
- BART: Bayesian additive regression trees
- Bayesian backfitting (with comments and a rejoinder)
- Bayesian regression tree ensembles that adapt to smoothness and sparsity
- Bayesian regression tree models for causal inference: regularization, confounding, and heterogeneous effects (with discussion)
- Bayesian regression trees for high-dimensional prediction and variable selection
- Boosting for high-dimensional linear models
- Ferguson distributions via Polya urn schemes
- Greedy function approximation: a gradient boosting machine
- Heteroscedastic BART via Multiplicative Regression Trees
- Hierarchical Dirichlet Processes
- Hierarchical species sampling models
- Log-linear Bayesian additive regression trees for multinomial logistic and count regression models
- Mixture models with a prior on the number of components
- Optimal predictive model selection
- Posterior concentration for Bayesian regression trees and forests
- Semiparametric analysis of clustered interval-censored survival data using soft Bayesian additive regression trees (SBART)
- Semiparametric mixed-scale models using shared Bayesian forests
- Surface estimation, variable selection, and the nonparametric oracle property
- Variable Selection with ABC Bayesian Forests
- Variable selection for BART: an application to gene regulation