gbm

gbm: Generalized Boosted Regression Models. This package implements extensions to Freund and Schapire's AdaBoost algorithm and Friedman's gradient boosting machine. It includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart).
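
A minimal usage sketch in R (not taken from the package documentation; the simulated data and the specific tuning values are illustrative assumptions):

    library(gbm)

    # Simulated data: binary outcome with two informative predictors (illustrative only)
    set.seed(1)
    n  <- 500
    x1 <- rnorm(n); x2 <- rnorm(n)
    y  <- rbinom(n, 1, plogis(x1 - 0.5 * x2))
    dat <- data.frame(y, x1, x2)

    # Fit a boosted model with Bernoulli (logistic) loss; other values of
    # 'distribution' ("gaussian", "laplace", "tdist", "quantile", "poisson",
    # "coxph", "adaboost", ...) select the losses listed above
    fit <- gbm(y ~ x1 + x2, data = dat, distribution = "bernoulli",
               n.trees = 1000, interaction.depth = 2, shrinkage = 0.01,
               cv.folds = 5)

    # Choose the number of trees by cross-validation, then predict probabilities
    best <- gbm.perf(fit, method = "cv", plot.it = FALSE)
    p    <- predict(fit, newdata = dat, n.trees = best, type = "response")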


References in zbMATH (referenced in 60 articles)

Showing results 1 to 20 of 60, sorted by year (citations).


  1. Crevecoeur, Jonas; Robben, Jens; Antonio, Katrien: A hierarchical reserving model for reported non-life insurance claims (2022)
  2. Lellep, Martin; Prexl, Jonathan; Eckhardt, Bruno; Linkmann, Moritz: Interpreted machine learning in fluid dynamics: explaining relaminarisation events in wall-bounded shear flows (2022)
  3. Welchowski, Thomas; Maloney, Kelly O.; Mitchell, Richard; Schmid, Matthias: Techniques to improve ecological interpretability of black-box machine learning models. Case study on biological health of streams in the United States with gradient boosted trees (2022)
  4. Agarwal, Satish; Singh, Ranjan Kumar; Ganguly, Adity; Kumar, Abhishek; Shrivastava, Shweta; Kumar, Ramesh; Ranjan, Rajeev; Vikas: Prediction of Coke CSR using time series model in coke plant (2021)
  5. Jared D. Huling, Menggang Yu: Subgroup Identification Using the personalized Package (2021) not zbMATH
  6. Miron B. Kursa: Praznik: High performance information-based feature selection (2021) not zbMATH
  7. Mistry, Miten; Letsios, Dimitrios; Krennrich, Gerhard; Lee, Robert M.; Misener, Ruth: Mixed-integer convex nonlinear optimization with gradient-boosted trees embedded (2021)
  8. Berk, Richard A.: Statistical learning from a regression perspective (2020)
  9. Chu, Jianghao; Lee, Tae-Hwy; Ullah, Aman: Component-wise AdaBoost algorithms for high-dimensional binary classification and class probability prediction (2020)
  10. Elman, Miriam R.; Minnier, Jessica; Chang, Xiaohui; Choi, Dongseok: Noise accumulation in high dimensional classification and total signal index (2020)
  11. Mišić, Velibor V.: Optimization of tree ensembles (2020)
  12. Pérez-Chacón, R.; Asencio-Cortés, G.; Martínez-Álvarez, F.; Troncoso, A.: Big data time series forecasting based on pattern sequence similarity and its application to the electricity demand (2020)
  13. Tianhui Zhou, Guangyu Tong, Fan Li, Laine E. Thomas, Fan Li: PSweight: An R Package for Propensity Score Weighting Analysis (2020) arXiv
  14. van den Bergh, Don; Bogaerts, Stefan; Spreen, Marinus; Flohr, Rob; Vandekerckhove, Joachim; Batchelder, William H.; Wagenmakers, Eric-Jan: Cultural consensus theory for the evaluation of patients’ mental health scores in forensic psychiatric hospitals (2020)
  15. Alireza S. Mahani; Mansour T.A. Sharabiani: Bayesian, and Non-Bayesian, Cause-Specific Competing-Risk Analysis for Parametric and Nonparametric Survival Functions: The R Package CFC (2019) not zbMATH
  16. Azmi, Mohamed; Runger, George C.; Berrado, Abdelaziz: Interpretable regularized class association rules algorithm for classification in a categorical data space (2019)
  17. Biau, G.; Cadre, B.; Rouvière, L.: Accelerated gradient boosting (2019)
  18. Cerqueira, Vitor; Torgo, Luís; Pinto, Fábio; Soares, Carlos: Arbitrage of forecasting experts (2019)
  19. Choi, Byeong Yeob; Wang, Chen-Pin; Michalek, Joel; Gelfond, Jonathan: Power comparison for propensity score methods (2019)
  20. Ramosaj, Burim; Pauly, Markus: Predicting missing values: a comparative study on non-parametric approaches for imputation (2019)
