flare: Family of Lasso Regression. The package "flare" implements a family of Lasso variants, including the Dantzig selector, LAD Lasso, SQRT Lasso, and Lq Lasso, for estimating high-dimensional sparse linear models. We adopt the alternating direction method of multipliers (ADMM) and convert the original optimization problem into a sequence of L1-penalized least-squares minimization problems, each of which can be solved efficiently by a linearization algorithm. A multi-stage screening approach is adopted for further acceleration. Beyond sparse linear model estimation, we also extend these Lasso variants to sparse Gaussian graphical model estimation, including TIGER and CLIME, using either an L1 or an adaptive penalty. Missing values are tolerated by the Dantzig selector and CLIME. The computation is memory-optimized through sparse matrix output.
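The ADMM strategy described above can be sketched with a textbook example: the plain Lasso solved by alternating a ridge-like update, a soft-thresholding (L1 proximal) step, and a dual update. This is a minimal illustrative sketch in Python, not flare's actual implementation (flare is an R package whose solver additionally uses linearization and multi-stage screening); the function names `admm_lasso` and `soft_threshold` are illustrative.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: proximal operator of k * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by standard ADMM.

    Split the problem as f(x) + g(z) with the constraint x = z, where
    f is the least-squares term and g is the L1 penalty.
    """
    n, p = A.shape
    x = np.zeros(p)
    z = np.zeros(p)
    u = np.zeros(p)  # scaled dual variable
    # The x-update solves a ridge system; cache its Cholesky factor once.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(p))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: (A'A + rho*I) x = A'b + rho*(z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: L1 proximal step (soft-thresholding)
        z = soft_threshold(x + u, lam / rho)
        # dual update
        u = u + x - z
    return z  # z carries the exact sparsity pattern
```

On a well-conditioned noiseless problem this recovers the sparse coefficients up to a small shrinkage bias; flare's variants differ mainly in the loss term (e.g., absolute deviation for LAD Lasso, square-root loss for SQRT Lasso) and in how the subproblems are linearized.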

References in zbMATH (referenced in 17 articles, 1 standard article)

Showing results 1 to 17 of 17.
Sorted by year (citations)

  1. Pan, Yuqing; Mai, Qing: Efficient computation for differential network analysis with applications to quadratic discriminant analysis (2020)
  2. Wang, Fan; Mukherjee, Sach; Richardson, Sylvia; Hill, Steven M.: High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking (2020)
  3. Blanchet, Jose; Kang, Yang; Murthy, Karthyek: Robust Wasserstein profile inference and applications to machine learning (2019)
  4. Gan, Lingrui; Narisetty, Naveen N.; Liang, Feng: Bayesian regularization for graphical models with unequal shrinkage (2019)
  5. Wang, Yafei; Kong, Linglong; Jiang, Bei; Zhou, Xingcai; Yu, Shimei; Zhang, Li; Heo, Giseon: Wavelet-based LASSO in functional linear quantile regression (2019)
  6. Karl Sjöstrand; Line Clemmensen; Rasmus Larsen; Gudmundur Einarsson; Bjarne Ersbøll: SpaSM: A MATLAB Toolbox for Sparse Statistical Modeling (2018) not zbMATH
  7. Li, Xingguo; Zhao, Tuo; Arora, Raman; Liu, Han; Hong, Mingyi: On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization (2018)
  8. Shi, Yue-Yong; Cao, Yong-Xiu; Yu, Ji-Chang; Jiao, Yu-Ling: Variable selection via generalized SELO-penalized linear regression models (2018)
  9. Shi, Yue Yong; Jiao, Yu Ling; Cao, Yong Xiu; Liu, Yan Yan: An alternating direction method of multipliers for MCP-penalized regression with high-dimensional data (2018)
  10. Shi, Yueyong; Wu, Yuanshan; Xu, Deyi; Jiao, Yuling: An ADMM with continuation algorithm for non-convex SICA-penalized regression in high dimensions (2018)
  11. Stucky, Benjamin; Van de Geer, Sara: Sharp oracle inequalities for square root regularization (2017)
  12. Zhang, Haixiang; Zheng, Yinan; Yoon, Grace; Zhang, Zhou; Gao, Tao; Joyce, Brian; Zhang, Wei; Schwartz, Joel; Vokonas, Pantel; Colicino, Elena; Baccarelli, Andrea; Hou, Lifang; Liu, Lei: Regularized estimation in sparse high-dimensional multivariate regression, with application to a DNA methylation study (2017)
  13. Antoniadis, Anestis; Gijbels, Irène; Lambert-Lacroix, Sophie; Poggi, Jean-Michel: Joint estimation and variable selection for mean and dispersion in proper dispersion models (2016)
  14. Gueuning, Thomas; Claeskens, Gerda: Confidence intervals for high-dimensional partially linear single-index models (2016)
  15. Han, Fang; Lu, Huanran; Liu, Han: A direct estimation of high dimensional stationary vector autoregressions (2015)
  16. Li, Xingguo; Zhao, Tuo; Yuan, Xiaoming; Liu, Han: The flare package for high dimensional linear regression and precision matrix estimation in R (2015)
  17. Pang, Haotian; Liu, Han; Vanderbei, Robert: The FASTCLIME package for linear programming and large-scale precision matrix estimation in R (2014)