XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the gradient boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems with beyond billions of examples.

References in zbMATH (referenced in 40 articles)

Showing results 1 to 20 of 40, sorted by year (citations).


  1. Berk, Richard A.: Statistical learning from a regression perspective (2020)
  2. Boehmke, Brad; Greenwell, Brandon M.: Hands-on machine learning with R (2020)
  3. Coqueret, Guillaume; Guida, Tony: Training trees on tails with applications to portfolio choice (2020)
  4. Gubela, Robin M.; Lessmann, Stefan; Jaroszewicz, Szymon: Response transformation and profit decomposition for revenue uplift modeling (2020)
  5. Huang, Shih-Feng; Guo, Meihui; Chen, May-Ru: Stock market trend prediction using a functional time series approach (2020)
  6. Kharrat, Tarak; McHale, Ian G.; Peña, Javier López: Plus-minus player ratings for soccer (2020)
  7. Mahajan, Pravar Dilip; Maurya, Abhinav; Megahed, Aly; Elwany, Alaa; Strong, Ray; Blomberg, Jeanette: Optimizing predictive precision in imbalanced datasets for actionable revenue change prediction (2020)
  8. Ruehle, Fabian: Data science applications to string theory (2020)
  9. Sandha, Sandeep Singh; Aggarwal, Mohit; Fedorov, Igor; Srivastava, Mani: MANGO: A Python Library for Parallel Hyperparameter Tuning (2020) arXiv
  10. van Engelen, Jesper E.; Hoos, Holger H.: A survey on semi-supervised learning (2020)
  11. Berrar, Daniel; Lopes, Philippe; Dubitzky, Werner: Incorporating domain knowledge in machine learning for soccer outcome prediction (2019)
  12. Biau, G.; Cadre, B.; Rouvière, L.: Accelerated gradient boosting (2019)
  13. Chen, Li-Pang; Yi, Grace Y.; Zhang, Qihuang; He, Wenqing: Multiclass analysis and prediction with network structured covariates (2019)
  14. Chvalovský, Karel; Jakubův, Jan; Suda, Martin; Urban, Josef: ENIGMA-NG: efficient neural and gradient-boosted inference guidance for \(\mathrm{E}\) (2019)
  15. Huan, Er-Yang; Wen, Gui-Hua: Multilevel and multiscale feature aggregation in deep networks for facial constitution classification (2019)
  16. Hubáček, Ondřej; Šourek, Gustav; Železný, Filip: Learning to predict soccer results from relational data with gradient boosted trees (2019)
  17. Jaeger, Byron C.; Long, D. Leann; Long, Dustin M.; Sims, Mario; Szychowski, Jeff M.; Min, Yuan-I; McClure, Leslie A.; Howard, George; Simon, Noah: Oblique random survival forests (2019)
  18. Kocbek, Primoz; Fijacko, Nino; Soguero-Ruiz, Cristina; Mikalsen, Karl Øyvind; Maver, Uros; Povalej Brzan, Petra; Stozer, Andraz; Jenssen, Robert; Skrøvseth, Stein Olav; Stiglic, Gregor: Maximizing interpretability and cost-effectiveness of surgical site infection (SSI) predictive models using feature-specific regularized logistic regression on preoperative temporal data (2019)
  19. Pei, Ziang; Cao, Shuangliang; Lu, Lijun; Chen, Wufan: Direct cellularity estimation on breast cancer histopathology images using transfer learning (2019)
  20. Tian, Xiaolu; Chong, Yutian; Huang, Yutao; Guo, Pi; Li, Mengjie; Zhang, Wangjian; Du, Zhicheng; Li, Xiangyong; Hao, Yuantao: Using machine learning algorithms to predict hepatitis B surface antigen seroclearance (2019)
