SDPNAL+

SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints.

In this paper, we present a majorized semismooth Newton-CG augmented Lagrangian method, called SDPNAL+, for semidefinite programming (SDP) with partial or full nonnegative constraints on the matrix variable. SDPNAL+ is a much enhanced version of SDPNAL introduced by X.-Y. Zhao et al. [SIAM J. Optim. 20, No. 4, 1737–1765 (2010; Zbl 1213.90175)] for solving generic SDPs. SDPNAL works very efficiently for nondegenerate SDPs but may encounter numerical difficulty on degenerate ones. Here we tackle this difficulty by employing a majorized semismooth Newton-CG augmented Lagrangian method coupled with a convergent 3-block alternating direction method of multipliers introduced recently by D. Sun et al. [SIAM J. Optim. 25, No. 2, 882–915 (2015; Zbl 06444987)]. Numerical results for various large-scale SDPs with or without nonnegative constraints show that the proposed method is not only fast but also robust in obtaining accurate solutions. It outperforms, by a significant margin, two other competitive, publicly available codes based on first-order methods: (1) the alternating direction method of multipliers based solver SDPAD by Z. Wen et al. [Math. Program. Comput. 2, No. 3–4, 203–230 (2010; Zbl 1206.90088)] and (2) the two-easy-block-decomposition hybrid proximal extragradient method 2EBD-HPE by R. Monteiro et al. ["A first-order block-decomposition method for solving two-easy-block structured semidefinite programs", Math. Program. Comput. 6, No. 2, 103–150 (2014; doi:10.1007/s12532-013-0062-7)]. In contrast to these two codes, we are able to solve all 95 difficult SDP problems arising from relaxations of quadratic assignment problems tested in SDPNAL to an accuracy of 10^{-6} efficiently, whereas SDPAD and 2EBD-HPE successfully solve only 30 and 16 of these problems, respectively.
In addition, SDPNAL+ appears to be the only viable method currently available for solving large-scale SDPs arising from the rank-1 tensor approximation problems constructed by J. Nie and L. Wang [SIAM J. Matrix Anal. Appl. 35, No. 3, 1155–1179 (2014; Zbl 1305.65134)]. The largest rank-1 tensor approximation problem we solved (in about 14.5 h) is nonsym(21,4); its resulting SDP problem has matrix dimension n = 9261 and m = 12,326,390 equality constraints.
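To make the problem class concrete: augmented-Lagrangian and ADMM-type solvers for SDPs with nonnegative constraints repeatedly project iterates onto the positive semidefinite cone and onto the nonnegative orthant. The sketch below illustrates these two standard projections only; it is not the authors' implementation, and all function names are illustrative.

```python
import numpy as np

def proj_psd(X):
    """Project a symmetric matrix onto the positive semidefinite cone.

    Standard construction: eigendecompose and clip negative eigenvalues
    to zero. This is the basic building block of augmented Lagrangian
    and ADMM methods for SDP.
    """
    X = (X + X.T) / 2.0              # symmetrize against round-off
    w, V = np.linalg.eigh(X)         # eigenvalues in ascending order
    w = np.maximum(w, 0.0)           # clip negative eigenvalues
    return (V * w) @ V.T             # reassemble V diag(w) V^T

def proj_nonneg(X):
    """Entrywise projection onto the nonnegative orthant (X >= 0)."""
    return np.maximum(X, 0.0)
```

A doubly nonnegative constraint (X PSD and X >= 0 entrywise), as in the SDP relaxations of quadratic assignment problems mentioned above, combines both cones; the projections are applied to separate copies of the variable that are driven together by the multiplier updates.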


References in zbMATH (referenced in 35 articles, 2 standard articles)

Showing results 1 to 20 of 35.
Sorted by year (citations)


  1. Chen, Shixiang; Ma, Shiqian; Man-Cho So, Anthony; Zhang, Tong: Proximal gradient method for nonsmooth optimization over the Stiefel manifold (2020)
  2. Ding, Chao; Sun, Defeng; Sun, Jie; Toh, Kim-Chuan: Spectral operators of matrices: semismoothness and characterizations of the generalized Jacobian (2020)
  3. Gaar, Elisabeth; Rendl, Franz: A computational study of exact subgraph based SDP bounds for max-cut, stable set and coloring (2020)
  4. Goulart, Paul J.; Nakatsukasa, Yuji; Rontsis, Nikitas: Accuracy of approximate projection to the semidefinite cone (2020)
  5. Li, Xiaodong; Li, Yang; Ling, Shuyang; Strohmer, Thomas; Wei, Ke: When do birds of a feather flock together? k-means, proximity, and conic programming (2020)
  6. Sun, Defeng; Toh, Kim-Chuan; Yuan, Yancheng; Zhao, Xin-Yuan: SDPNAL+: A Matlab software for semidefinite programming with bound constraints (version 1.0) (2020)
  7. Yang, Qingzhi; Li, Yiyong; Huang, Pengfei: A novel formulation of the max-cut problem and related algorithm (2020)
  8. Ahmadi, Amir Ali; Majumdar, Anirudha: DSOS and SDSOS optimization: more tractable alternatives to sum of squares and semidefinite optimization (2019)
  9. Campos, Juan S.; Misener, Ruth; Parpas, Panos: A multilevel analysis of the Lasserre hierarchy (2019)
  10. Cui, Ying; Sun, Defeng; Toh, Kim-Chuan: On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming (2019)
  11. Cui, Ying; Sun, Defeng; Toh, Kim-Chuan: Computing the best approximation over the intersection of a polyhedral set and the doubly nonnegative cone (2019)
  12. Eisenach, Carson; Liu, Han: Efficient, certifiably optimal clustering with applications to latent variable graphical models (2019)
  13. Hu, Shenglong; Sun, Defeng; Toh, Kim-Chuan: Best nonnegative rank-one approximations of tensors (2019)
  14. Ito, Naoki; Kim, Sunyoung; Kojima, Masakazu; Takeda, Akiko; Toh, Kim-Chuan: Algorithm 996: BBCPOP: a sparse doubly nonnegative relaxation of polynomial optimization problems with binary, box, and complementarity constraints (2019)
  15. Khoo, Yuehaw; Ying, Lexing: Convex relaxation approaches for strictly correlated density functional theory (2019)
  16. Komeiji, Hikaru; Kim, Sunyoung; Yamashita, Makoto: On the conditions for the finite termination of ADMM and its applications to SOS polynomials feasibility problems (2019)
  17. Liu, Ya-Feng; Liu, Xin; Ma, Shiqian: On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming (2019)
  18. Tran-Dinh, Quoc; Sun, Tianxiao; Lu, Shu: Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms (2019)
  19. Chang, Xiaokai; Liu, Sanyang; Zhao, Pengjun: A note on the sufficient initial condition ensuring the convergence of directly extended 3-block ADMM for special semidefinite programming (2018)
  20. Cui, Ying; Pang, Jong-Shi; Sen, Bodhisattva: Composite difference-MAX programs for modern statistical estimation problems (2018)
