glmnet

R package glmnet: Lasso and elastic-net regularized generalized linear models. Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression, and the Cox model. Two recent additions are the multiresponse Gaussian and the grouped multinomial models. The algorithm uses cyclical coordinate descent in a pathwise fashion, as described in the paper listed below.
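For illustration, a minimal R sketch of how the package is typically called to fit and tune an elastic-net path (the simulated data and the alpha value below are assumptions for the example, not taken from the source):

# Minimal sketch of typical glmnet usage; data and alpha are illustrative assumptions.
library(glmnet)

set.seed(1)
x <- matrix(rnorm(100 * 20), nrow = 100, ncol = 20)   # 100 observations, 20 predictors
y <- rnorm(100)                                        # Gaussian response

# Fit the entire elastic-net regularization path; alpha = 1 gives the lasso,
# alpha = 0 gives ridge, and intermediate values mix the two penalties.
fit <- glmnet(x, y, family = "gaussian", alpha = 0.5)

# Cross-validate to choose lambda along the path, then extract coefficients.
cvfit <- cv.glmnet(x, y, alpha = 0.5)
coef(cvfit, s = "lambda.min")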


References in zbMATH (referenced in 482 articles, 1 standard article)

Showing results 21 to 40 of 482, sorted by year (citations).


  1. Furmańczyk, Konrad; Rejchel, Wojciech: High-dimensional linear model selection motivated by multiple testing (2020)
  2. García-Portugués, Eduardo; Álvarez-Liébana, Javier; Álvarez-Pérez, Gonzalo; González-Manteiga, Wenceslao: Goodness-of-fit tests for functional linear models based on integrated projections (2020)
  3. Gold, David; Lederer, Johannes; Tao, Jing: Inference for high-dimensional instrumental variables regression (2020)
  4. Guo, Xiao; Zhang, Hai: Sparse directed acyclic graphs incorporating the covariates (2020)
  5. Hastie, Trevor; Tibshirani, Robert; Tibshirani, Ryan: Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons (2020)
  6. Hazimeh, Hussein; Mazumder, Rahul: Fast best subset selection: coordinate descent and local combinatorial optimization algorithms (2020)
  7. Hong, Mingyi; Chang, Tsung-Hui; Wang, Xiangfeng; Razaviyayn, Meisam; Ma, Shiqian; Luo, Zhi-Quan: A block successive upper-bound minimization method of multipliers for linearly constrained convex optimization (2020)
  8. Huang, Yimin; Kong, Xiangshun; Ai, Mingyao: Optimal designs in sparse linear models (2020)
  9. James, Gareth M.; Paulson, Courtney; Rusmevichientong, Paat: Penalized and constrained optimization: an application to high-dimensional website advertising (2020)
  10. Jeong, Jun-Yong; Kang, Ju-Seok; Jun, Chi-Hyuck: Regularization-based model tree for multi-output regression (2020)
  11. Jeon, Jong-June; Kim, Yongdai; Won, Sungho; Choi, Hosik: Primal path algorithm for compositional data analysis (2020)
  12. Jiang, Guangxin; Hong, L. Jeff; Nelson, Barry L.: Online risk monitoring using offline simulation (2020)
  13. Lai, Yuanhao; McLeod, Ian: Ensemble quantile classifier (2020)
  14. Liu, Wenchen; Tang, Yincai; Wu, Xianyi: Separating variables to accelerate non-convex regularized optimization (2020)
  15. Mainak Jas; Titipat Achakulvisut; Aid Idrizović; Daniel E. Acuna; Matthew Antalek; Vinicius Marques; Tommy Odland; Ravi Prakash Garg; Mayank Agrawal; Yu Umegaki; Peter Foley; Hugo L Fernandes; Drew Harris; Beibin Li; Olivier Pieters; Scott Otterson; Giovanni De Toni; Chris Rodgers; Eva Dyer; Matti Hamalainen; Konrad Kording; Pavan Ramkumar: Pyglmnet: Python implementation of elastic-net regularized generalized linear models (2020) not zbMATH
  16. Nikooienejad, Amir; Wang, Wenyi; Johnson, Valen E.: Bayesian variable selection for survival data using inverse moment priors (2020)
  17. Nima S. Hejazi; Jeremy R. Coyle; Mark J. van der Laan: hal9001: Scalable highly adaptive lasso regression in R (2020) not zbMATH
  18. Oda, Ryoya; Yanagihara, Hirokazu: A fast and consistent variable selection method for high-dimensional multivariate linear regression with a large number of explanatory variables (2020)
  19. Pang, Tongyao; Wu, Chunlin; Liu, Zhifang: A cubic spline penalty for sparse approximation under tight frame balanced model (2020)
  20. Pan, Yuqing; Mai, Qing: Efficient computation for differential network analysis with applications to quadratic discriminant analysis (2020)
