glmnet

R package glmnet: Lasso and elastic-net regularized generalized linear models. Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression, and the Cox model. Two recent additions are the multiresponse Gaussian and the grouped multinomial models. The algorithm uses cyclical coordinate descent in a pathwise fashion, as described in the standard article listed below.
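A minimal usage sketch in R, using the package's core functions glmnet() and cv.glmnet(); the data below are simulated purely for illustration:

```r
library(glmnet)

set.seed(1)
# Simulated data: 100 observations, 20 predictors
x <- matrix(rnorm(100 * 20), nrow = 100, ncol = 20)
y <- rnorm(100)

# Elastic-net path: alpha = 1 gives the lasso, alpha = 0 gives ridge,
# intermediate values mix the two penalties
fit <- glmnet(x, y, family = "gaussian", alpha = 0.5)

# Cross-validation to choose the penalty parameter lambda
cvfit <- cv.glmnet(x, y, alpha = 0.5)

# Coefficients at the lambda minimizing the cross-validated error
coef(cvfit, s = "lambda.min")
```

The same interface applies to the other model families by changing the family argument (e.g. family = "binomial" for logistic regression or family = "cox" for the Cox model).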


References in zbMATH (referenced in 426 articles, 1 standard article)

Showing results 21 to 40 of 426.
Sorted by year (citations)


  1. Posch, Konstantin; Arbeiter, Maximilian; Pilz, Juergen: A novel Bayesian approach for variable selection in linear regression models (2020)
  2. Rachael C. Aikens, Joseph Rigdon, Justin Lee, Michael Baiocchi, Jonathan Chen: Stratified Pilot Matching in R: The stratamatch Package (2020) arXiv
  3. Renaux, Claude; Buzdugan, Laura; Kalisch, Markus; Bühlmann, Peter: Hierarchical inference for genome-wide association studies: a view on methodology with software (2020)
  4. Ren, Sheng; Kang, Emily L.; Lu, Jason L.: MCEN: a method of simultaneous variable selection and clustering for high-dimensional multinomial regression (2020)
  5. Sauk, Benjamin; Ploskas, Nikolaos; Sahinidis, Nikolaos: GPU parameter tuning for tall and skinny dense linear least squares problems (2020)
  6. Sayan Putatunda, Dayananda Ubrangala, Kiran Rama, Ravi Kondapalli: DriveML: An R Package for Driverless Machine Learning (2020) arXiv
  7. Schmid, Matthias; Welchowski, Thomas; Wright, Marvin N.; Berger, Moritz: Discrete-time survival forests with Hellinger distance decision trees (2020)
  8. Takano, Yuichi; Miyashiro, Ryuhei: Best subset selection via cross-validation criterion (2020)
  9. Tang, Lu; Zhou, Ling; Song, Peter X.-K.: Distributed simultaneous inference in generalized linear models via confidence distribution (2020)
  10. Tardivel, Patrick J. C.; Servien, Rémi; Concordet, Didier: Simple expressions of the Lasso and SLOPE estimators in low-dimension (2020)
  11. Wang, Fan; Mukherjee, Sach; Richardson, Sylvia; Hill, Steven M.: High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking (2020)
  12. Winkler, Joab R.; Mitrouli, Marilena: Condition estimation for regression and feature selection (2020)
  13. Wu, Ho-Hsiang; Ferreira, Marco A. R.; Elkhouly, Mohamed; Ji, Tieming: Hyper nonlocal priors for variable selection in generalized linear models (2020)
  14. Wu, Yichong; Li, Tiejun; Liu, Xiaoping; Chen, Luonan: Differential network inference via the fused D-trace loss with cross variables (2020)
  15. Xu, Xinyi; Li, Xiangjie; Zhang, Jingxiao: Regularization methods for high-dimensional sparse control function models (2020)
  16. Zhang, Ning; Wu, Jia; Zhang, Liwei: A linearly convergent majorized ADMM with indefinite proximal terms for convex composite programming and its applications (2020)
  17. Zhao, Yaqing; Bondell, Howard: Solution paths for the generalized Lasso with applications to spatially varying coefficients regression (2020)
  18. Agor, Joseph; Özaltın, Osman Y.: Feature selection for classification models via bilevel optimization (2019)
  19. Ahonen, Ilmari; Nevalainen, Jaakko; Larocque, Denis: Prediction with a flexible finite mixture-of-regressions (2019)
  20. Algamal, Zakariya Yahya; Lee, Muhammad Hisyam: A two-stage sparse logistic regression for optimal gene selection in high-dimensional microarray data classification (2019)
