RTRMC: Low-rank matrix completion via preconditioned optimization on the Grassmann manifold.

We address the problem of recovering a large matrix of low rank when most of its entries are unknown. We exploit the geometry of the low-rank constraint to recast the task as an unconstrained optimization problem on a single Grassmann manifold, which we solve with second-order Riemannian trust-region methods (RTRMC 2) and Riemannian conjugate-gradient methods (RCGMC). We introduce a preconditioner for the Hessian that helps control the conditioning of the problem, and we detail preconditioned versions of the Riemannian optimization algorithms. The cost of each iteration is linear in the number of known entries. The proposed methods are competitive with state-of-the-art algorithms on a wide range of problem instances; in particular, they perform well on rectangular matrices. Second-order and preconditioned methods are also well suited to ill-conditioned matrix completion tasks.
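The reformulation described above can be sketched concretely: for a fixed orthonormal basis U of the column space, the optimal coefficient matrix W is found column-wise by least squares over the observed entries, so the cost depends on the subspace spanned by U alone, i.e. on a point of the Grassmannian. The Python sketch below implements this variable-projection cost together with plain Riemannian gradient descent using a QR retraction. It is a minimal toy illustration, not the paper's trust-region or preconditioned solvers; all function names here are our own.

```python
import numpy as np

def completion_cost(U, M, mask):
    """Variable-projection cost for a fixed column space U (n x r, orthonormal).

    The coefficient matrix W is eliminated column by column with least
    squares restricted to the observed entries, so the cost is a function
    of span(U) alone -- a point on the Grassmann manifold.
    """
    r, m = U.shape[1], M.shape[1]
    W = np.zeros((r, m))
    for j in range(m):
        rows = mask[:, j]  # observed entries of column j
        W[:, j] = np.linalg.lstsq(U[rows], M[rows, j], rcond=None)[0]
    resid = (U @ W - M) * mask
    return 0.5 * np.sum(resid ** 2), W

def grassmann_step(U, M, mask, step=1.0):
    """One Riemannian gradient step with backtracking and QR retraction.

    A first-order stand-in for the paper's methods: project the Euclidean
    gradient onto the horizontal (tangent) space, move against it, and
    retract back to orthonormal columns via a QR factorization.
    """
    cost0, W = completion_cost(U, M, mask)
    resid = (U @ W - M) * mask
    grad = resid @ W.T
    grad -= U @ (U.T @ grad)  # horizontal-space projection
    while step > 1e-12:
        Q = np.linalg.qr(U - step * grad)[0]  # QR retraction
        if completion_cost(Q, M, mask)[0] < cost0:
            return Q
        step *= 0.5  # backtrack until the cost decreases
    return U

# Tiny demo: recover a rank-2 matrix from roughly 60% of its entries.
rng = np.random.default_rng(0)
n, m, r = 30, 40, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
mask = rng.random((n, m)) < 0.6

U = np.linalg.qr(rng.standard_normal((n, r)))[0]
cost_init = completion_cost(U, M, mask)[0]
for _ in range(200):
    U = grassmann_step(U, M, mask)
cost_final = completion_cost(U, M, mask)[0]
```

Note that each iteration only touches the observed entries, mirroring the per-iteration cost being linear in the number of known entries; the trust-region and preconditioning machinery of RTRMC would replace the plain gradient step here.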

References in zbMATH (referenced in 34 articles, 1 standard article)

Showing results 1 to 20 of 34, sorted by year (citations).


  1. Chen, Shixiang; Ma, Shiqian; Man-Cho So, Anthony; Zhang, Tong: Proximal gradient method for nonsmooth optimization over the Stiefel manifold (2020)
  2. Hosseini, Reshad; Sra, Suvrit: Recent advances in stochastic Riemannian optimization (2020)
  3. Kuang, Shenfen; Chao, Hongyang; Li, Qia: Majorized proximal alternating imputation for regularized rank constrained matrix completion (2020)
  4. Rasheed, Ali S.; Mayah, Faik; Al-Jumaili, Ahmed A. H.: Optimization techniques on affine differential manifolds (2020)
  5. Wei, Ke; Cai, Jian-Feng; Chan, Tony F.; Leung, Shingyu: Guarantees of Riemannian optimization for low rank matrix completion (2020)
  6. Mishra, Bamdev; Kasai, Hiroyuki; Jawanpuria, Pratik; Saroop, Atul: A Riemannian gossip approach to subspace learning on Grassmann manifold (2019)
  7. Seshadri, Pranay; Yuchi, Shaowu; Parks, Geoffrey T.: Dimension reduction via Gaussian ridge functions (2019)
  8. Wang, Jin; Wang, Yan-Ping; Xu, Zhi; Wang, Chuan-Long: Accelerated low rank matrix approximate algorithms for matrix completion (2019)
  9. Wen, Rui-Ping; Li, Shu-Zhen; Zhou, Fang: Toeplitz matrix completion via smoothing augmented Lagrange multiplier algorithm (2019)
  10. Huang, Wen; Absil, P.-A.; Gallivan, K. A.: A Riemannian BFGS method without differentiated retraction for nonconvex optimization problems (2018)
  11. Huang, Wen; Hand, Paul: Blind deconvolution by a steepest descent algorithm on a quotient manifold (2018)
  12. Park, Dohyung; Kyrillidis, Anastasios; Caramanis, Constantine; Sanghavi, Sujay: Finding low-rank solutions via nonconvex matrix factorization, efficiently and provably (2018)
  13. Zimmermann, Ralf; Peherstorfer, Benjamin; Willcox, Karen: Geometric subspace updates with applications to online adaptive nonlinear model reduction (2018)
  14. Huang, Wen; Absil, P.-A.; Gallivan, K. A.: Intrinsic representation of tangent vectors and vector transports on matrix manifolds (2017)
  15. Peng, Dingtao; Xiu, Naihua; Yu, Jian: (S_{1/2}) regularization methods and fixed point algorithms for affine rank minimization problems (2017)
  16. Wen, Rui-Ping; Liu, Li-Xia: The two-stage iteration algorithms based on the shortest distance for low-rank matrix completion (2017)
  17. Cambier, Léopold; Absil, P.-A.: Robust low-rank matrix completion by Riemannian optimization (2016)
  18. Kressner, Daniel; Steinlechner, Michael; Vandereycken, Bart: Preconditioned low-rank Riemannian optimization for linear systems with tensor product structure (2016)
  19. Mishra, Bamdev; Sepulchre, Rodolphe: Riemannian preconditioning (2016)
  20. Tanner, Jared; Wei, Ke: Low rank matrix completion by alternating steepest descent methods (2016)
