softImpute: Matrix Completion via Iterative Soft-Thresholded SVD. Iterative methods for matrix completion that use nuclear-norm regularization. There are two main approaches. The first uses iterative soft-thresholded SVDs to impute the missing values; the second uses alternating least squares. Both have an "EM" flavor, in that at each iteration the matrix is completed with the current estimate. For large matrices there is a special sparse-matrix class named "Incomplete" that efficiently handles all computations. The package includes procedures for centering and scaling rows, columns, or both, and for computing low-rank SVDs on large sparse centered matrices (i.e., principal components).

References in zbMATH (referenced in 45 articles, 2 standard articles)

Showing results 1 to 20 of 45.
Sorted by year (citations)


  1. Wong, Raymond K. W.; Zhang, Xiaoke: Nonparametric operator-regularized covariance function estimation for functional data (2019)
  2. Amjad, Muhammad; Shah, Devavrat; Shen, Dennis: Robust synthetic control (2018)
  3. Bertsimas, Dimitris; Pawlowski, Colin; Zhuo, Ying Daisy: From predictive methods to missing data imputation: an optimization approach (2018)
  4. Cottet, Vincent; Alquier, Pierre: 1-bit matrix completion: PAC-Bayesian analysis of a variational approximation (2018)
  5. Fithian, William; Mazumder, Rahul: Flexible low-rank statistical modeling with missing data and side information (2018)
  6. Lee, Namgil; Kim, Jong-Min: Block tensor train decomposition for missing data estimation (2018)
  7. Moreira, Nilson J. M.; Duarte, Leonardo T.; Lavor, Carlile; Torezzan, Cristiano: A novel low-rank matrix completion approach to estimate missing entries in Euclidean distance matrix (2018)
  8. O’Rourke, Sean; Vu, Van; Wang, Ke: Random perturbation of low rank matrices: improving classical bounds (2018)
  9. Shabat, Gil; Shmueli, Yaniv; Aizenbud, Yariv; Averbuch, Amir: Randomized LU decomposition (2018)
  10. Veretennikova, Maria A.; Sikorskii, Alla; Boivin, Michael J.: Parameters of stochastic models for electroencephalogram data as biomarkers for child’s neurodevelopment after cerebral malaria (2018)
  11. Yao, Quanming; Kwok, James T.: Efficient learning with a family of nonconvex regularizers by redistributing nonconvexity (2018)
  12. Bouwmans, Thierry; Sobral, Andrews; Javed, Sajid; Jung, Soon Ki; Zahzah, El-Hadi: Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset (2017)
  13. Brockmeier, Austin J.; Mu, Tingting; Ananiadou, Sophia; Goulermas, John Y.: Quantifying the informativeness of similarity measurements (2017)
  14. Chi, Eric C.; Allen, Genevera I.; Baraniuk, Richard G.: Convex biclustering (2017)
  15. Durante, Daniele: A note on the multiplicative gamma process (2017)
  16. Eghbali, Reza; Fazel, Maryam: Decomposable norm minimization with proximal-gradient homotopy algorithm (2017)
  17. Freund, Robert M.; Grigas, Paul; Mazumder, Rahul: An extended Frank-Wolfe method with “in-face” directions, and its application to low-rank matrix completion (2017)
  18. Mareček, Jakub; Richtárik, Peter; Takáč, Martin: Matrix completion under interval uncertainty (2017)
  19. Monga, Vishal; Mousavi, Hojjat Seyed; Srinivas, Umamahesh: Sparsity constrained estimation in image processing and computer vision (2017)
  20. Wong, Raymond K. W.; Lee, Thomas C. M.: Matrix completion with noisy entries and outliers (2017)
