SOFAR: large-scale association network learning. Many modern big data applications feature large numbers of both responses and predictors. Better statistical efficiency and scientific insights can be enabled by understanding the large-scale response-predictor association network structures via layers of sparse latent factors ranked by importance. Yet sparsity and orthogonality have long been two largely incompatible goals. To accommodate both features, in this paper we suggest the method of sparse orthogonal factor regression (SOFAR) via the sparse singular value decomposition with orthogonality-constrained optimization to learn the underlying association networks, with broad applications to both unsupervised and supervised learning tasks, such as biclustering with sparse singular value decomposition, sparse principal component analysis, sparse factor analysis, and sparse vector autoregression analysis. Exploiting the framework of convexity-assisted nonconvex optimization, we derive nonasymptotic error bounds for the suggested procedure that characterize its theoretical advantages. The statistical guarantees are powered by an efficient SOFAR algorithm with a convergence guarantee. Both the computational and theoretical advantages of our procedure are demonstrated with several simulations and real data examples.
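To illustrate the sparse singular value decomposition idea at the heart of SOFAR, the following is a minimal sketch of a generic rank-1 sparse SVD via alternating soft-thresholded power iterations. This is a common SSVD heuristic for intuition only, not the SOFAR algorithm itself; the matrix `C`, the penalty levels `lam_u`/`lam_v`, and the toy data are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: shrinks toward zero, inducing sparsity
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_rank1_svd(C, lam_u=0.05, lam_v=0.05, n_iter=100, seed=0):
    """Rank-1 sparse SVD of C via alternating soft-thresholded power
    iterations (a generic SSVD heuristic, not the exact SOFAR procedure)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(C.shape[1])
    v /= np.linalg.norm(v)
    u = np.zeros(C.shape[0])
    for _ in range(n_iter):
        u = soft_threshold(C @ v, lam_u)      # sparse update of left factor
        nu = np.linalg.norm(u)
        if nu == 0:
            break
        u /= nu
        v = soft_threshold(C.T @ u, lam_v)    # sparse update of right factor
        nv = np.linalg.norm(v)
        if nv == 0:
            break
        v /= nv
    d = u @ C @ v                             # implied singular value
    return d, u, v

# Toy example: noisy rank-1 matrix with sparse left/right factors
rng = np.random.default_rng(1)
u_true = np.zeros(30); u_true[:5] = 1.0 / np.sqrt(5)
v_true = np.zeros(20); v_true[:4] = 1.0 / np.sqrt(4)
C = 5.0 * np.outer(u_true, v_true) + 0.05 * rng.standard_normal((30, 20))
d, u, v = sparse_rank1_svd(C)
print(d, np.count_nonzero(u), np.count_nonzero(v))
```

Because the thresholding zeroes out small coordinates before renormalizing, the recovered factors are sparse, which is what makes the estimated singular vectors interpretable as association-network layers.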
References in zbMATH (referenced in 4 articles, 1 standard article)
- Wu, Jie; Zheng, Zemin; Li, Yang; Zhang, Yi: Scalable interpretable learning for multi-response error-in-variables regression (2020)
- Yu, Ming; Gupta, Varun; Kolar, Mladen: Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach (2020)
- Ren, Zhao; Kang, Yongjian; Fan, Yingying; Lv, Jinchi: Tuning-free heterogeneous inference in massive networks (2019)
- Uematsu, Yoshimasa; Fan, Yingying; Chen, Kun; Lv, Jinchi; Lin, Wei: SOFAR: large-scale association network learning (2019)