glmnet

R package glmnet: Lasso and elastic-net regularized generalized linear models. Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression, Poisson regression, and the Cox model. Two recent additions are the multiresponse Gaussian and the grouped multinomial models. The algorithm uses cyclical coordinate descent, applied in a pathwise fashion over a sequence of regularization parameters, as described in the paper listed below.
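A minimal sketch of typical usage, for orientation: the simulated data, seed, and variable names below are illustrative, not from the package documentation. The `alpha` parameter mixes the lasso (`alpha = 1`) and ridge (`alpha = 0`) penalties, and `cv.glmnet` selects the regularization parameter `lambda` by cross-validation along the path.

```r
library(glmnet)

# Illustrative simulated data: sparse linear model with 3 active predictors.
set.seed(1)
n <- 100; p <- 20
x <- matrix(rnorm(n * p), n, p)       # predictor matrix
beta <- c(rep(2, 3), rep(0, p - 3))   # sparse true coefficients
y <- as.numeric(x %*% beta + rnorm(n))

# Fit the whole elastic-net path (alpha = 0.5 mixes lasso and ridge).
fit <- glmnet(x, y, family = "gaussian", alpha = 0.5)

# Cross-validation chooses lambda along the path.
cvfit <- cv.glmnet(x, y, alpha = 0.5)
coef(cvfit, s = "lambda.min")         # coefficients at the selected lambda
```

For non-Gaussian models the same interface applies with, e.g., `family = "binomial"` (logistic), `family = "poisson"`, or `family = "cox"`.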


References in zbMATH (referenced in 422 articles, 1 standard article)

Showing results 41 to 60 of 422, sorted by year (citations).


  1. Boot, Tom; Nibbering, Didier: Forecasting using random subspace methods (2019)
  2. Cai, Liqian; Bhattacharjee, Arnab; Calantone, Roger; Maiti, Taps: Variable selection with spatially autoregressive errors: a generalized moments Lasso estimator (2019)
  3. Cai, T. T.; Li, H.; Ma, J.; Xia, Y.: Differential Markov random field analysis with an application to detecting differential microbial community networks (2019)
  4. Cerqueira, Vitor; Torgo, Luís; Pinto, Fábio; Soares, Carlos: Arbitrage of forecasting experts (2019)
  5. Das, Debraj; Gregory, Karl; Lahiri, S. N.: Perturbation bootstrap in adaptive Lasso (2019)
  6. De Micheaux, Pierre Lafaye; Liquet, Benoît; Sutton, Matthew: PLS for Big Data: a unified parallel algorithm for regularised group PLS (2019)
  7. Flaxman, Seth; Chirico, Michael; Pereira, Pau; Loeffler, Charles: Scalable high-resolution forecasting of sparse spatiotemporal events with kernel methods: a winning solution to the NIJ “Real-time crime forecasting challenge” (2019)
  8. Fort, Gersende; Ollier, Edouard; Samson, Adeline: Stochastic proximal-gradient algorithms for penalized mixed models (2019)
  9. Freue, Gabriela V. Cohen; Kepplinger, David; Salibián-Barrera, Matías; Smucler, Ezequiel: Robust elastic net estimators for variable selection and identification of proteomic biomarkers (2019)
  10. Ge, Jason; Li, Xingguo; Jiang, Haoming; Liu, Han; Zhang, Tong; Wang, Mengdi; Zhao, Tuo: picasso: a sparse learning library for high dimensional data analysis in R and Python (2019)
  11. Gelb, Anne; Hou, X.; Li, Q.: Numerical analysis for conservation laws using ℓ₁ minimization (2019)
  12. Ghahari, Azar; Newlands, Nathaniel K.; Lyubchich, Vyacheslav; Gel, Yulia R.: Deep learning at the interface of agricultural insurance risk and spatio-temporal uncertainty in weather extremes (2019)
  13. Giurcanu, Mihai; Presnell, Brett: Bootstrapping Lasso-type estimators in regression models (2019)
  14. Guastavino, Sabrina; Benvenuto, Federico: A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery (2019)
  15. Guibert, Quentin; Lopez, Olivier; Piette, Pierrick: Forecasting mortality rate improvements with a high-dimensional VAR (2019)
  16. Gu, Jiaying; Fu, Fei; Zhou, Qing: Penalized estimation of directed acyclic graphs from discrete data (2019)
  17. Guo, Xiao; Zhang, Hai; Wang, Yao; Liang, Yong: Structure learning of sparse directed acyclic graphs incorporating the scale-free property (2019)
  18. Hoeltgebaum, Henrique Helfer; Battey, Heather: HCmodelSets: an R package for specifying sets of well-fitting models in regression with a large number of potential explanatory variables (2019) arXiv
  19. Hsu, Hsiang-Ling; Chang, Yuan-chin Ivan; Chen, Ray-Bing: Greedy active learning algorithm for logistic regression models (2019)
  20. Jaeger, Byron C.; Long, D. Leann; Long, Dustin M.; Sims, Mario; Szychowski, Jeff M.; Min, Yuan-I; McClure, Leslie A.; Howard, George; Simon, Noah: Oblique random survival forests (2019)
