SLEP: Sparse Learning with Efficient Projections.

Main features:
1) First-order method. Each iteration requires only the function value and the gradient, so the algorithms can handle large-scale sparse data.
2) Optimal convergence rate. The convergence rate O(1/k²) is optimal for smooth convex optimization via first-order black-box methods.
3) Efficient projection. The projection problem (proximal operator) can be solved efficiently.
4) Pathwise solutions. The SLEP package provides functions that efficiently compute the pathwise solutions corresponding to a series of regularization parameters via the "warm-start" technique.
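The features above can be illustrated with a minimal sketch (in Python, not SLEP's actual MATLAB API) of the accelerated proximal gradient method (FISTA) applied to the lasso, min_x 0.5·||Ax − b||² + λ·||x||₁: each iteration uses only a gradient evaluation plus a cheap proximal step (soft-thresholding), the iterates enjoy the optimal O(1/k²) rate, and a regularization path is computed with warm starts. All function names here (`soft_threshold`, `fista_lasso`, `lasso_path`) are illustrative, not part of the SLEP package.

```python
# Hedged sketch of accelerated proximal gradient (FISTA) for the lasso;
# function names are illustrative and do not mirror SLEP's MATLAB interface.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding), computable in O(n)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, x0=None, max_iter=500):
    """First-order method with the optimal O(1/k^2) convergence rate."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(n) if x0 is None else x0.copy()
    y, t = x.copy(), 1.0
    for _ in range(max_iter):
        grad = A.T @ (A @ y - b)             # only gradient evaluations needed
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x

def lasso_path(A, b, lams):
    """Pathwise solutions: each solve is warm-started from the previous one."""
    x, sols = None, []
    for lam in sorted(lams, reverse=True):   # large lambda -> sparse warm start
        x = fista_lasso(A, b, lam, x0=x)
        sols.append(x)
    return sols
```

Solving the path from large to small λ makes each warm start cheap, since the previous (sparser) solution is already close to the next one.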

References in zbMATH (referenced in 32 articles)

Showing results 1 to 20 of 32.
Sorted by year (citations)


  1. Barbero, Álvaro; Sra, Suvrit: Modular proximal optimization for multidimensional total-variation regularization (2018)
  2. Jin, Fei; Lee, Lung-fei: Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model (2018)
  3. Sjöstrand, Karl; Clemmensen, Line; Larsen, Rasmus; Einarsson, Gudmundur; Ersbøll, Bjarne: SpaSM: A MATLAB Toolbox for Sparse Statistical Modeling (2018)
  4. Liu, Yanqing; Tao, Jiyuan; Zhang, Huan; Xiu, Xianchao; Kong, Lingchen: Fused LASSO penalized least absolute deviation estimator for high dimensional linear regression (2018)
  5. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
  6. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On efficiently solving the subproblems of a level-set method for fused lasso problems (2018)
  7. Xue, Wei; Zhang, Wensheng; Yu, Gaohang: Least absolute deviations learning of multiple tasks (2018)
  8. Li, Ying-Yi; Zhang, Hai-Bin; Li, Fei: A modified proximal gradient method for a family of nonsmooth convex optimization problems (2017)
  9. Yau, Chun Yip; Hui, Tsz Shing: LARS-type algorithm for group Lasso (2017)
  10. Frandi, Emanuele; Ñanculef, Ricardo; Lodi, Stefano; Sartori, Claudio; Suykens, Johan A. K.: Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee (2016)
  11. Zhang, Liangliang; Yang, Longqi; Hu, Guyu; Pan, Zhisong; Li, Zhen: Link prediction via sparse Gaussian graphical model (2016)
  12. Wang, Jie; Wonka, Peter; Ye, Jieping: Lasso screening rules via dual polytope projection (2015)
  13. Xiang, Shuo; Shen, Xiaotong; Ye, Jieping: Efficient nonconvex sparse group feature selection via continuous and discrete optimization (2015)
  14. Yang, Yi; Zou, Hui: A fast unified algorithm for solving group-lasso penalize learning problems (2015)
  15. Zhang, Hai-Bin; Jiang, Jiao-Jiao; Zhao, Yun-Bin: On the proximal Landweber Newton method for a class of nonsmooth convex problems (2015)
  16. Zou, Jian; Fu, Yuli: Split Bregman algorithms for sparse group lasso with application to MRI reconstruction (2015)
  17. Zou, Jian; Fu, Yuli; Zhang, Qiheng; Li, Haifeng: Split Bregman algorithms for multiple measurement vector problem (2015)
  18. Chen, Jianhui; Zhou, Jiayu; Ye, Jieping: Low-rank and sparse multi-task learning (2014)
  19. Li, Leijun; Hu, Qinghua; Wu, Xiangqian; Yu, Daren: Exploration of classification confidence in ensemble learning (2014)
  20. Lin, Xiaodong; Pham, Minh; Ruszczyński, Andrzej: Alternating linearization for structured regularization problems (2014)
