XGBoost

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the gradient boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems with billions of examples.

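A minimal sketch of training a boosted tree model with the xgboost Python package is shown below; the synthetic dataset and the hyperparameter values are illustrative assumptions, not part of this entry.

```python
import numpy as np
import xgboost as xgb

# Synthetic binary-classification data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# DMatrix is XGBoost's native data container.
dtrain = xgb.DMatrix(X[:800], label=y[:800])
dvalid = xgb.DMatrix(X[800:], label=y[800:])

params = {
    "objective": "binary:logistic",  # logistic loss for binary labels
    "max_depth": 4,                  # depth of each boosted tree
    "eta": 0.1,                      # learning rate (shrinkage)
    "eval_metric": "logloss",
}

# Boost 100 trees, monitoring log-loss on the validation split.
booster = xgb.train(params, dtrain, num_boost_round=100,
                    evals=[(dvalid, "valid")], verbose_eval=False)

preds = booster.predict(dvalid)  # predicted probabilities
print("validation accuracy:", float(((preds > 0.5) == y[800:]).mean()))
```

The same model can also be fit through the scikit-learn-style wrappers (xgboost.XGBClassifier / XGBRegressor) when pipeline compatibility is preferred.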

References in zbMATH (referenced in 71 articles)

Showing results 1 to 20 of 71, sorted by year (citations).


  1. Akalin, Altuna: Computational genomics with R. With the assistance of Vedran Franke, Bora Uyar and Jonathan Ronen (2021)
  2. Arash Pakbin, Xiaochen Wang, Bobak J. Mortazavi, Donald K.K. Lee: BoXHED 2.0: Scalable boosting of functional data in survival analysis (2021) arXiv
  3. Bertsimas, Dimitris; Dunn, Jack; Wang, Yuchen: Near-optimal nonlinear regression trees (2021)
  4. Carrizosa, Emilio; Molero-Río, Cristina; Romero Morales, Dolores: Mathematical optimization in classification and regression trees (2021)
  5. Ding, Chenchen; Han, Haitao; Li, Qianyue; Yang, Xiaoxia; Liu, Taigang: iT3SE-PX: identification of bacterial type III secreted effectors using PSSM profiles and XGBoost feature selection (2021)
  6. Fermanian, Adeline: Embedding and learning with signatures (2021)
  7. Poonawala, Hasan A.; Lauffer, Niklas; Topcu, Ufuk: Training classifiers for feedback control with safety in mind (2021)
  8. Yue Zhao, Zhi Qiao, Cao Xiao, Lucas Glass, Jimeng Sun: PyHealth: A Python Library for Health Predictive Models (2021) arXiv
  9. Zhang, Dan; Chen, Hua-Dong; Zulfiqar, Hasan; Yuan, Shi-Shi; Huang, Qin-Lai; Zhang, Zhao-Yue; Deng, Ke-Jun: iBLP: an XGBoost-based predictor for identifying bioluminescent proteins (2021)
  10. Benkeser, David; Petersen, Maya; van der Laan, Mark J.: Improved small-sample estimation of nonlinear cross-validated prediction metrics (2020)
  11. Berk, Richard A.: Statistical learning from a regression perspective (2020)
  12. Boehmke, Brad; Greenwell, Brandon M.: Hands-on machine learning with R (2020)
  13. Bullock, Joseph; Luccioni, Alexandra; Pham, Katherine Hoffman; Lam, Cynthia Sin Nga; Luengo-Oroz, Miguel: Mapping the landscape of artificial intelligence applications against COVID-19 (2020)
  14. Coqueret, Guillaume; Guida, Tony: Training trees on tails with applications to portfolio choice (2020)
  15. Gauthier, Thibault: Tree neural networks in HOL4 (2020)
  16. Gubela, Robin M.; Lessmann, Stefan; Jaroszewicz, Szymon: Response transformation and profit decomposition for revenue uplift modeling (2020)
  17. Gweon, Hyukjun; Li, Shu; Mamon, Rogemar: An effective bias-corrected bagging method for the valuation of large variable annuity portfolios (2020)
  18. Han, Sunwoo; Kim, Hyunjoong; Lee, Yung-Seop: Double random forest (2020)
  19. Huang, Shih-Feng; Guo, Meihui; Chen, May-Ru: Stock market trend prediction using a functional time series approach (2020)
  20. Hubert Baniecki, Wojciech Kretowicz, Piotr Piatyszek, Jakub Wisniewski, Przemyslaw Biecek: dalex: Responsible Machine Learning with Interactive Explainability and Fairness in Python (2020) arXiv
