sparsenet

R package SparseNet: coordinate descent with nonconvex penalties. We address the problem of sparse selection in linear models. A number of nonconvex penalties have been proposed in the literature for this purpose, along with a variety of convex-relaxation algorithms for finding good solutions. We pursue a coordinate-descent approach for optimization and study its convergence properties. We characterize the properties of penalties suitable for this approach, study their corresponding threshold functions, and describe a degrees-of-freedom (df) standardizing reparametrization that assists our pathwise algorithm. The MC+ penalty is ideally suited to this task, and we use it to demonstrate the performance of our algorithm. Certain technical derivations and experiments related to this article are included in the supplementary materials section.
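To make the role of the threshold function concrete, here is a minimal NumPy sketch of MC+ thresholding inside a cyclic coordinate-descent loop for penalized least squares. It is an independent illustration, not the package's own code: the function names (mcplus_threshold, mcplus_coordinate_descent), the unit-norm standardization of the columns, the simulated data, and the choice of (lambda, gamma) are assumptions made only for this example. The threshold operator shown is the MC+ form for gamma > 1, which interpolates between soft thresholding (gamma -> infinity) and hard thresholding (gamma -> 1+).

```python
import numpy as np


def mcplus_threshold(z, lam, gam):
    """MC+ threshold operator for one coordinate (requires gam > 1).

    Minimizes 0.5*(b - z)**2 + P(|b|; lam, gam), where the MC+ penalty is
    P(t; lam, gam) = lam*t - t**2/(2*gam)  for 0 <= t <= lam*gam,
                   = lam**2 * gam / 2      for t > lam*gam.
    gam -> inf gives soft thresholding; gam -> 1+ approaches hard thresholding.
    """
    az = abs(z)
    if az <= lam:
        return 0.0
    if az <= gam * lam:
        return np.sign(z) * (az - lam) / (1.0 - 1.0 / gam)
    return z


def mcplus_coordinate_descent(X, y, lam, gam, n_sweeps=200, tol=1e-8):
    """Cyclic coordinate descent for 0.5*||y - X b||^2 + sum_j P(|b_j|; lam, gam).

    Assumes each column of X has unit L2 norm, so every coordinate update
    is a single MC+ threshold of the partial-residual inner product.
    """
    n, p = X.shape
    b = np.zeros(p)
    r = y.copy()                           # residual y - X b (b starts at zero)
    for _ in range(n_sweeps):
        b_old = b.copy()
        for j in range(p):
            zj = X[:, j] @ r + b[j]        # univariate least-squares coefficient
            bj = mcplus_threshold(zj, lam, gam)
            r += X[:, j] * (b[j] - bj)     # keep the residual in sync
            b[j] = bj
        if np.max(np.abs(b - b_old)) < tol:
            break
    return b


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 100, 50
    X = rng.standard_normal((n, p))
    X /= np.linalg.norm(X, axis=0)         # unit-norm columns, as assumed above
    beta = np.zeros(p)
    beta[:5] = 2.0                         # 5 true nonzero coefficients
    y = X @ beta + 0.1 * rng.standard_normal(n)

    b_hat = mcplus_coordinate_descent(X, y, lam=0.5, gam=3.0)
    print("estimated support:", np.flatnonzero(b_hat))
```

The R package itself fits a whole surface of solutions over (lambda, gamma) with warm starts rather than a single raw loop like this; the sketch only shows why each coordinate update reduces to one application of the threshold function.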


References in zbMATH (referenced in 80 articles, 1 standard article)

Showing results 41 to 60 of 80, sorted by year (citations).
  1. Huang, Po-Hsien; Chen, Hung; Weng, Li-Jen: A penalized likelihood method for structural equation modeling (2017)
  2. Mkhadri, Abdallah; Ouhourane, Mohamed; Oualkacha, Karim: A coordinate descent algorithm for computing penalized smooth quantile regression (2017)
  3. Mak, Simon; Wu, C. F. Jeff: cmenet: a new method for bi-level variable selection of conditional main effects (2017) arXiv
  4. Suzumura, Shinya; Ogawa, Kohei; Sugiyama, Masashi; Karasuyama, Masayuki; Takeuchi, Ichiro: Homotopy continuation approaches for robust SV classification and regression (2017)
  5. Yamamoto, Michio; Hwang, Heungsun: Dimension-reduced clustering of functional data via subspace separation (2017)
  6. Zeng, Jinshan; Peng, Zhimin; Lin, Shaobo: GAITA: a Gauss-Seidel iterative thresholding algorithm for ℓ_q regularized least squares regression (2017)
  7. Bertsimas, Dimitris; King, Angela: OR forum: An algorithmic approach to linear regression (2016)
  8. Bertsimas, Dimitris; King, Angela; Mazumder, Rahul: Best subset selection via a modern optimization lens (2016)
  9. Chen, Ting-Huei; Sun, Wei; Fine, Jason P.: Designing penalty functions in high dimensional problems: the role of tuning parameters (2016)
  10. He, Qianchuan; Kong, Linglong; Wang, Yanhua; Wang, Sijian; Chan, Timothy A.; Holland, Eric: Regularized quantile regression under heterogeneous sparsity with application to quantitative genetic traits (2016)
  11. Liu, Hongcheng; Yao, Tao; Li, Runze: Global solutions to folded concave penalized nonconvex learning (2016)
  12. Liu, Zhenqiu; Li, Gang: Efficient regularized regression with L_0 penalty for variable selection and network construction (2016)
  13. Zhang, Xiang; Wu, Yichao; Wang, Lan; Li, Runze: A consistent information criterion for support vector machines in diverging model spaces (2016)
  14. Aragam, Bryon; Zhou, Qing: Concave penalized estimation of sparse Gaussian Bayesian networks (2015)
  15. Hirose, Kei; Yamamoto, Michio: Sparse estimation via nonconcave penalized likelihood in factor analysis model (2015)
  16. Loh, Po-Ling; Wainwright, Martin J.: Regularized M-estimators with nonconvexity: statistical and algorithmic theory for local optima (2015)
  17. Pan, Zheng; Zhang, Changshui: Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression (2015)
  18. Wright, Stephen J.: Coordinate descent algorithms (2015)
  19. Xiang, Shuo; Shen, Xiaotong; Ye, Jieping: Efficient nonconvex sparse group feature selection via continuous and discrete optimization (2015)
  20. Zhang, Zhihua; Li, Jin: Compound Poisson processes, latent shrinkage priors and Bayesian nonconvex penalization (2015)