BayesTree: Bayesian Methods for Tree Based Models. Implementation of BART: Bayesian Additive Regression Trees.

We develop a Bayesian “sum-of-trees” model in which each tree is constrained by a regularization prior to be a weak learner; fitting and inference are carried out by an iterative Bayesian backfitting MCMC algorithm that generates samples from the posterior. Effectively, BART is a nonparametric Bayesian regression approach that uses dimensionally adaptive random basis elements. Motivated by ensemble methods in general, and boosting algorithms in particular, BART is defined by a statistical model: a prior and a likelihood. This approach enables full posterior inference, including point and interval estimates of the unknown regression function as well as the marginal effects of potential predictors. By keeping track of predictor inclusion frequencies, BART can also be used for model-free variable selection. BART’s many features are illustrated in a bake-off against competing methods on 42 data sets, in a simulation experiment, and on a drug discovery classification problem.
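The sum-of-trees model above is fit by Bayesian backfitting: each tree is repeatedly updated against the partial residuals left by the other trees. The following is only a rough, deterministic sketch of that residual-cycling structure in Python, using single-split stumps as stand-in weak learners; it omits BART's regularization prior and MCMC sampling entirely, and all function names are illustrative, not part of the BayesTree package.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a single-split regression stump to residuals r (1-D feature x)."""
    best_sse, best_params = np.inf, None
    for s in np.unique(x)[:-1]:  # every candidate split point
        left, right = r[x <= s], r[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_params = sse, (s, left.mean(), right.mean())
    s, m_left, m_right = best_params
    return lambda z: np.where(z <= s, m_left, m_right)

def backfit_sum_of_stumps(x, y, m=10, sweeps=20):
    """Deterministic backfitting loop: stump j is refit to the partial
    residuals y minus the fits of the other m-1 stumps (no prior, no MCMC)."""
    preds = np.zeros((m, len(y)))
    stumps = [lambda z: np.zeros_like(z)] * m
    for _ in range(sweeps):
        for j in range(m):
            resid = y - preds.sum(axis=0) + preds[j]  # partial residuals for stump j
            stumps[j] = fit_stump(x, resid)
            preds[j] = stumps[j](x)
    return lambda z: sum(f(z) for f in stumps)  # the "sum-of-trees" prediction

# Illustrative data: a smooth signal plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = np.sin(2 * x) + 0.1 * rng.normal(size=200)
f = backfit_sum_of_stumps(x, y)
```

In BART proper, each per-tree update is a Metropolis-Hastings draw from the tree's conditional posterior given the partial residuals, so repeated sweeps yield posterior samples rather than a single fit.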

References in zbMATH (referenced in 59 articles, 1 standard article)

Showing results 1 to 20 of 59.
Sorted by year (citations)


  1. Rodney Sparapani, Charles Spanbauer, Robert McCulloch: Nonparametric Machine Learning and Efficient Computation with Bayesian Additive Regression Trees: The BART R Package (2021) not zbMATH
  2. Antonelli, Joseph; Daniels, Michael J.: Discussion of PENCOMP (2019)
  3. Athey, Susan; Tibshirani, Julie; Wager, Stefan: Generalized random forests (2019)
  4. Carnegie, Nicole Bohme: Comment: Contributions of model features to BART causal inference performance using ACIC 2016 competition data (2019)
  5. Crawford, Lorin; Flaxman, Seth R.; Runcie, Daniel E.; West, Mike: Variable prioritization in nonlinear black box methods: a genetic association case study (2019)
  6. Dorie, Vincent; Hill, Jennifer; Shalit, Uri; Scott, Marc; Cervone, Dan: Automated versus do-it-yourself methods for causal inference: lessons learned from a data analysis competition (2019)
  7. Nethery, Rachel C.; Mealli, Fabrizia; Dominici, Francesca: Estimating population average causal effects in the presence of non-overlap: the effect of natural gas compressor station exposure on cancer mortality (2019)
  8. Park, Soyoung; Carriquiry, Alicia: Learning algorithms to evaluate forensic glass evidence (2019)
  9. Zeldow, Bret; Lo Re, Vincent III; Roy, Jason: A semiparametric modeling approach using Bayesian additive regression trees with an application to evaluate heterogeneous treatment effects (2019)
  10. Angelino, Elaine; Larus-Stone, Nicholas; Alabi, Daniel; Seltzer, Margo; Rudin, Cynthia: Learning certifiably optimal rule lists for categorical data (2018)
  11. Hernández, Belinda; Raftery, Adrian E.; Pennington, Stephen R.; Parnell, Andrew C.: Bayesian additive regression trees using Bayesian model averaging (2018)
  12. Liang, Faming; Li, Qizhai; Zhou, Lei: Bayesian neural networks for selection of drug sensitive genes (2018)
  13. Nalenz, Malte; Villani, Mattias: Tree ensembles with rule structured horseshoe regularization (2018)
  14. Wager, Stefan; Athey, Susan: Estimation and inference of heterogeneous treatment effects using random forests (2018)
  15. Conversano, Claudio; Dusseldorp, Elise: Modeling threshold interaction effects through the logistic classification trunk (2017)
  16. Goessling, Marc: Logitboost autoregressive networks (2017)
  17. Guo, Wentian; Ji, Yuan; Catenacci, Daniel V. T.: A subgroup cluster-based Bayesian adaptive design for precision medicine (2017)
  18. Hu, Ruimeng; Ludkovski, Mike: Sequential design for ranking response surfaces (2017)
  19. Lansangan, Joseph Ryan G.; Barrios, Erniel B.: Simultaneous dimension reduction and variable selection in modeling high dimensional data (2017)
  20. Adam Kapelner and Justin Bleich: bartMachine: Machine Learning with Bayesian Additive Regression Trees (2016) not zbMATH
