SLEP: Sparse Learning with Efficient Projections.

Main features:
1) First-order method. At each iteration, only the function value and the gradient need to be evaluated, so the algorithms can handle large-scale sparse data.
2) Optimal convergence rate. The convergence rate O(1/k²) is optimal for smooth convex optimization via first-order black-box methods.
3) Efficient projection. The projection problem (proximal operator) can be solved efficiently.
4) Pathwise solutions. The SLEP package provides functions that efficiently compute the pathwise solutions corresponding to a series of regularization parameters via the “warm-start” technique.
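The four ingredients above can be illustrated with a minimal sketch (this is not SLEP's actual code, and all function names here are hypothetical): an accelerated proximal gradient method of the FISTA type for the Lasso, whose proximal operator is closed-form soft-thresholding, combined with a warm-started regularization path.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1: a cheap, closed-form 'projection' step."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista_lasso(A, b, lam, x0=None, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by accelerated proximal gradient.

    First-order method: each iteration uses only one gradient evaluation,
    and the accelerated scheme attains the optimal O(1/k^2) rate.
    """
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(n) if x0 is None else x0.copy()
    y, t = x.copy(), 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

def lasso_path(A, b, lams, iters=200):
    """Pathwise solutions: warm-start each solve from the previous solution."""
    x, path = None, []
    for lam in sorted(lams, reverse=True):  # start at large lam (sparse, easy)
        x = fista_lasso(A, b, lam, x0=x, iters=iters)
        path.append((lam, x))
    return path
```

Sweeping the regularization parameter from large to small lets each solve start near its predecessor's solution, which is why warm-started paths are typically much cheaper than solving each problem from scratch.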

References in zbMATH (referenced in 34 articles)

Showing results 1 to 20 of 34, sorted by year (citations).


  1. Zhang, Yangjing; Zhang, Ning; Sun, Defeng; Toh, Kim-Chuan: An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems (2020)
  2. Zhu, Li; Huo, Zhiguang; Ma, Tianzhou; Oesterreich, Steffi; Tseng, George C.: Bayesian indicator variable selection to incorporate hierarchical overlapping group structure in multi-omics applications (2019)
  3. Barbero, Álvaro; Sra, Suvrit: Modular proximal optimization for multidimensional total-variation regularization (2018)
  4. Jin, Fei; Lee, Lung-fei: Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model (2018)
  5. Karl Sjöstrand; Line Clemmensen; Rasmus Larsen; Gudmundur Einarsson; Bjarne Ersbøll: SpaSM: A MATLAB Toolbox for Sparse Statistical Modeling (2018) not zbMATH
  6. Liu, Yanqing; Tao, Jiyuan; Zhang, Huan; Xiu, Xianchao; Kong, Lingchen: Fused LASSO penalized least absolute deviation estimator for high dimensional linear regression (2018)
  7. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On efficiently solving the subproblems of a level-set method for fused lasso problems (2018)
  8. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
  9. Xue, Wei; Zhang, Wensheng; Yu, Gaohang: Least absolute deviations learning of multiple tasks (2018)
  10. Li, Ying-Yi; Zhang, Hai-Bin; Li, Fei: A modified proximal gradient method for a family of nonsmooth convex optimization problems (2017)
  11. Yau, Chun Yip; Hui, Tsz Shing: LARS-type algorithm for group Lasso (2017)
  12. Frandi, Emanuele; Ñanculef, Ricardo; Lodi, Stefano; Sartori, Claudio; Suykens, Johan A. K.: Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee (2016)
  13. Zhang, Liangliang; Yang, Longqi; Hu, Guyu; Pan, Zhisong; Li, Zhen: Link prediction via sparse Gaussian graphical model (2016)
  14. Wang, Jie; Wonka, Peter; Ye, Jieping: Lasso screening rules via dual polytope projection (2015)
  15. Xiang, Shuo; Shen, Xiaotong; Ye, Jieping: Efficient nonconvex sparse group feature selection via continuous and discrete optimization (2015)
  16. Yang, Yi; Zou, Hui: A fast unified algorithm for solving group-lasso penalize learning problems (2015)
  17. Zhang, Hai-Bin; Jiang, Jiao-Jiao; Zhao, Yun-Bin: On the proximal Landweber Newton method for a class of nonsmooth convex problems (2015)
  18. Zou, Jian; Fu, Yuli: Split Bregman algorithms for sparse group lasso with application to MRI reconstruction (2015)
  19. Zou, Jian; Fu, Yuli; Zhang, Qiheng; Li, Haifeng: Split Bregman algorithms for multiple measurement vector problem (2015)
  20. Chen, Jianhui; Zhou, Jiayu; Ye, Jieping: Low-rank and sparse multi-task learning (2014)
