sparsenet

R package SparseNet: coordinate descent with nonconvex penalties. We address the problem of sparse selection in linear models. A number of nonconvex penalties have been proposed in the literature for this purpose, along with a variety of convex-relaxation algorithms for finding good solutions. We pursue a coordinate-descent approach for optimization, and study its convergence properties. We characterize the properties of penalties suitable for this approach, study their corresponding threshold functions, and describe a df-standardizing reparametrization that assists our pathwise algorithm. The MC+ penalty is ideally suited to this task, and we use it to demonstrate the performance of our algorithm. Certain technical derivations and experiments related to this article are included in the supplementary materials section.
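
For each coordinate, the coordinate-descent update described above reduces to applying the penalty's threshold function to a univariate least-squares coefficient. As a minimal illustrative sketch (not the package's internal code), the R function below implements the MC+ threshold operator for a penalty level lambda and nonconvexity parameter gamma > 1; the name mcplus_threshold and the toy call are hypothetical, chosen here only for illustration.

    ## MC+ threshold operator: coordinate-wise minimizer of
    ## 0.5*(beta - b_tilde)^2 + MC+ penalty(beta; lambda, gamma), for gamma > 1.
    ## (Illustrative sketch; not code from the sparsenet package.)
    mcplus_threshold <- function(b_tilde, lambda, gamma) {
      stopifnot(gamma > 1)
      a   <- abs(b_tilde)
      out <- numeric(length(b_tilde))              # |b_tilde| <= lambda maps to 0
      mid <- a > lambda & a <= gamma * lambda      # shrink, but less than the lasso would
      big <- a > gamma * lambda                    # large coefficients left unshrunk
      out[mid] <- sign(b_tilde[mid]) * (a[mid] - lambda) / (1 - 1 / gamma)
      out[big] <- b_tilde[big]
      out
    }

    ## Toy call: with lambda = 0.5 and gamma = 3 this returns -2, 0, 0.45, 3.
    mcplus_threshold(c(-2, 0.3, 0.8, 3), lambda = 0.5, gamma = 3)

As gamma tends to infinity the operator reduces to soft thresholding (the lasso), while as gamma approaches 1 it approaches hard thresholding; this is the sense in which the (lambda, gamma) family bridges the lasso and best-subset regimes along the computed path.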


References in zbMATH (referenced in 68 articles, 1 standard article)

Showing results 1 to 20 of 68, sorted by year (citations).


  1. Xu, Xinyi; Li, Xiangjie; Zhang, Jingxiao: Regularization methods for high-dimensional sparse control function models (2020)
  2. Bhadra, Anindya; Datta, Jyotishka; Li, Yunfan; Polson, Nicholas G.; Willard, Brandon: Prediction risk for the horseshoe regression (2019)
  3. Bhadra, Anindya; Datta, Jyotishka; Polson, Nicholas G.; Willard, Brandon: Lasso meets horseshoe: a survey (2019)
  4. Exarchakis, Georgios; Bornschein, Jörg; Sheikh, Abdul-Saboor; Dai, Zhenwen; Henniges, Marc; Drefs, Jakob; Lücke, Jörg: ProSper - a Python library for probabilistic sparse coding with non-standard priors and superpositions (2019) arXiv
  5. Lee, Seokho; Kim, Seonhwa: Marginalized Lasso in sparse regression (2019)
  6. Mak, Simon; Wu, C. F. Jeff: cmenet: a new method for bi-level variable selection of conditional main effects (2019)
  7. Meulman, Jacqueline J.; van der Kooij, Anita J.; Duisters, Kevin L. W.: ROS regression: integrating regularization with optimal scaling regression (2019)
  8. Moran, Gemma E.; Ročková, Veronika; George, Edward I.: Variance prior forms for high-dimensional Bayesian variable selection (2019)
  9. Pokarowski, Piotr; Rejchel, Wojciech; Soltys, Agnieszka; Frej, Michal; Mielniczuk, Jan: Improving Lasso for model selection and prediction (2019) arXiv
  10. Shi, Yueyong; Xu, Deyi; Cao, Yongxiu; Jiao, Yuling: Variable selection via generalized SELO-penalized Cox regression models (2019)
  11. Umezu, Yuta; Shimizu, Yusuke; Masuda, Hiroki; Ninomiya, Yoshiyuki: AIC for the non-concave penalized likelihood method (2019)
  12. Adachi, Kohei; Trendafilov, Nickolay T.: Sparsest factor analysis for clustering variables: a matrix decomposition approach (2018)
  13. Choiruddin, Achmad; Coeurjolly, Jean-François; Letué, Frédérique: Convex and non-convex regularization methods for spatial point processes intensity estimation (2018)
  14. Hirose, Kei; Imada, Miyuki: Sparse factor regression via penalized maximum likelihood estimation (2018)
  15. Huang, Jian; Jiao, Yuling; Liu, Yanyan; Lu, Xiliang: A constructive approach to L_0 penalized regression (2018)
  16. Jin, Shaobo; Moustaki, Irini; Yang-Wallentin, Fan: Approximated penalized maximum likelihood for exploratory factor analysis: an orthogonal case (2018)
  17. Li, Xingguo; Zhao, Tuo; Arora, Raman; Liu, Han; Hong, Mingyi: On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization (2018)
  18. Ročková, Veronika; George, Edward I.: The spike-and-slab LASSO (2018)
  19. Shi, Yue-Yong; Cao, Yong-Xiu; Yu, Ji-Chang; Jiao, Yu-Ling: Variable selection via generalized SELO-penalized linear regression models (2018)
  20. Shi, Yue Yong; Jiao, Yu Ling; Cao, Yong Xiu; Liu, Yan Yan: An alternating direction method of multipliers for MCP-penalized regression with high-dimensional data (2018)
