sparsenet

R package SparseNet: coordinate descent with nonconvex penalties. We address the problem of sparse selection in linear models. A number of nonconvex penalties have been proposed in the literature for this purpose, along with a variety of convex-relaxation algorithms for finding good solutions. We pursue a coordinate-descent approach for optimization, and study its convergence properties. We characterize the properties of penalties suitable for this approach, study their corresponding threshold functions, and describe a df-standardizing reparametrization that assists our pathwise algorithm. The MC+ penalty is ideally suited to this task, and we use it to demonstrate the performance of our algorithm. Certain technical derivations and experiments related to this article are included in the supplementary materials section.
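The coordinate-descent approach described above repeatedly applies the threshold function induced by the penalty to each coordinate. For the MC+ penalty with parameters (\lambda, \gamma) (with (\gamma > 1)) and standardized predictors, this threshold function interpolates between soft thresholding ((\gamma \to \infty)) and hard thresholding ((\gamma \to 1^+)). A minimal sketch of that threshold operator, with function and argument names chosen for illustration (they are not the package's API):

```python
import numpy as np

def mcplus_threshold(beta, lam, gamma):
    """MC+ threshold operator for a univariate coordinate update.

    Assumes gamma > 1 and standardized predictors, so the update
    piecewise-linearly shrinks the inner-product coefficient `beta`:
      |beta| <= lam           -> 0
      lam < |beta| <= gamma*lam -> sign(beta) * (|beta| - lam) / (1 - 1/gamma)
      |beta| > gamma*lam      -> beta  (no shrinkage)
    """
    b = np.abs(beta)
    return np.where(
        b <= lam,
        0.0,
        np.where(
            b <= gamma * lam,
            np.sign(beta) * (b - lam) / (1.0 - 1.0 / gamma),
            beta,
        ),
    )

# As gamma grows, the middle branch approaches the soft-threshold
# rule sign(beta) * (|beta| - lam); as gamma -> 1+, the middle
# branch steepens toward the hard-threshold rule.
```

Within the pathwise algorithm, such updates are applied coordinate-by-coordinate over a grid of ((\lambda, \gamma)) values; the df-standardizing reparametrization mentioned above aligns these grids so that models at different (\gamma) are comparable.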


References in zbMATH (referenced in 94 articles, 1 standard article)

Showing results 1 to 20 of 94.
Sorted by year (citations)


  1. Ben-Ameur, Walid; Neto, José: New bounds for subset selection from conic relaxations (2022)
  2. Li, Peili; Lu, Xiliang; Xiao, Yunhai: Smoothing Newton method for (\ell^0)-(\ell^2) regularized linear inverse problem (2022)
  3. Xia, Siwei; Yang, Yuehan; Yang, Hu: Sparse Laplacian shrinkage with the graphical Lasso estimator for regression problems (2022)
  4. Atamtürk, Alper; Gómez, Andrés; Han, Shaoning: Sparse and smooth signal estimation: convexification of (\ell_0)-formulations (2021)
  5. Bertsimas, Dimitris; Pauphilet, Jean; Van Parys, Bart: Sparse classification: a scalable discrete optimization perspective (2021)
  6. Bhadra, Anindya; Datta, Jyotishka; Polson, Nicholas G.; Willard, Brandon T.: The horseshoe-like regularization for feature subset selection (2021)
  7. Dedieu, Antoine; Hazimeh, Hussein; Mazumder, Rahul: Learning sparse classifiers: continuous and mixed integer optimization perspectives (2021)
  8. Huang, Jian; Jiao, Yuling; Jin, Bangti; Liu, Jin; Lu, Xiliang; Yang, Can: A unified primal dual active set algorithm for nonconvex sparse recovery (2021)
  9. Jeong, Himchan; Chang, Hyunwoong; Valdez, Emiliano A.: A non-convex regularization approach for stable estimation of loss development factors (2021)
  10. Pham, Minh; Lin, Xiaodong; Ruszczyński, Andrzej; Du, Yu: An outer-inner linearization method for non-convex and nondifferentiable composite regularization problems (2021)
  11. Sun, Ruoyu; Ye, Yinyu: Worst-case complexity of cyclic coordinate descent: (O(n^2)) gap with randomized version (2021)
  12. Zhang, Cheng; Dinh, Vu; Matsen, Frederick A. IV: Nonbifurcating phylogenetic tree inference via the adaptive Lasso (2021)
  13. Buccini, Alessandro; De la Cruz Cabrera, Omar; Donatelli, Marco; Martinelli, Andrea; Reichel, Lothar: Large-scale regression with non-convex loss and penalty (2020)
  14. Carlsson, Marcus; Gerosa, Daniele; Olsson, Carl: An unbiased approach to compressed sensing (2020)
  15. Gao, Yuanjun; Goetz, Jack; Connelly, Matthew; Mazumder, Rahul: Mining events with declassified diplomatic documents (2020)
  16. Griffin, Maryclare; Hoff, Peter D.: Testing sparsity-inducing penalties (2020)
  17. Hastie, Trevor; Tibshirani, Robert; Tibshirani, Ryan: Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons (2020)
  18. Hazimeh, Hussein; Mazumder, Rahul: Fast best subset selection: coordinate descent and local combinatorial optimization algorithms (2020)
  19. Mazumder, Rahul; Saldana, Diego; Weng, Haolei: Matrix completion with nonconvex regularization: spectral operators and scalable algorithms (2020)
  20. Mazumder, Rahul; Weng, Haolei: Computing the degrees of freedom of rank-regularized estimators and cousins (2020)
