LowRankModels

LowRankModels.jl is a Julia package for modeling and fitting generalized low rank models (GLRMs). GLRMs model a data array by a low rank matrix, and include many well known models in data analysis, such as principal components analysis (PCA), matrix completion, robust PCA, nonnegative matrix factorization, k-means, and many more. For more information on GLRMs, see our paper. There is a Python interface to this package, and a GLRM implementation in the H2O machine learning platform with interfaces in a variety of languages.

LowRankModels.jl makes it easy to mix and match loss functions and regularizers to construct a model suited to a particular data set. In particular, it supports:

- using different loss functions for different columns of the data array, which is useful when data types are heterogeneous (e.g., real, Boolean, and ordinal columns);
- fitting the model to only some of the entries in the table, which is useful for data tables with many missing (unobserved) entries; and
- adding offsets and scalings to the model without destroying sparsity, which is useful when the data is poorly scaled.
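To make the underlying model concrete, here is a minimal from-scratch sketch of the GLRM idea in Python (not the LowRankModels.jl or H2O API; the function name `fit_glrm` and all parameters are illustrative): approximate an m x n data array A by a product X·Y of rank k, minimizing a loss summed over only the observed entries plus quadratic regularization, via stochastic gradient steps. This corresponds to the simplest GLRM instance, with quadratic loss for every column; the package generalizes this by letting each column carry its own loss and regularizer.

```python
import random

def fit_glrm(A, obs, k, steps=3000, lr=0.01, reg=0.01, seed=0):
    """Fit a rank-k quadratic-loss GLRM to the entries of A listed in obs.

    A   : m x n data array (list of lists)
    obs : list of (i, j) index pairs of observed entries
    Returns factors X (m x k) and Y (k x n).
    """
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    X = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(m)]
    Y = [[rng.uniform(-0.1, 0.1) for _ in range(n)] for _ in range(k)]
    for _ in range(steps):
        for i, j in obs:
            # residual of the current low rank approximation at (i, j)
            pred = sum(X[i][l] * Y[l][j] for l in range(k))
            err = pred - A[i][j]
            for l in range(k):
                # gradient of quadratic loss + quadratic regularization
                gx = err * Y[l][j] + reg * X[i][l]
                gy = err * X[i][l] + reg * Y[l][j]
                X[i][l] -= lr * gx
                Y[l][j] -= lr * gy
    return X, Y

# Tiny rank-1 example with one unobserved entry: row 1 is 2x row 0,
# so the missing entry A[1][2] is determined by the rank-1 structure.
A = [[1.0, 2.0, 3.0],
     [2.0, 4.0, 6.0]]
obs = [(i, j) for i in range(2) for j in range(3) if (i, j) != (1, 2)]
X, Y = fit_glrm(A, obs, k=1)
pred = sum(X[1][l] * Y[l][2] for l in range(1))
# pred imputes the held-out entry; it should be close to the true value 6
```

Fitting only the observed entries is what turns this into matrix completion: the held-out entry is imputed from the learned factors rather than treated as zero.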


References in zbMATH (referenced in 37 articles, 1 standard article)

Showing results 1 to 20 of 37.
Sorted by year (citations)


  1. Berk Wheelock, Lauren; Pachamanova, Dessislava A.: Acceptable set topic modeling (2022)
  2. Bigot, Jérémie; Deledalle, Charles: Low-rank matrix denoising for count data using unbiased Kullback-Leibler risk estimation (2022)
  3. Saul, Lawrence K.: A nonlinear matrix decomposition for mining the zeros of sparse data (2022)
  4. Abdolali, Maryam; Gillis, Nicolas: Simplex-structured matrix factorization: sparsity-based identifiability and provably correct algorithms (2021)
  5. Lin, Kevin Z.; Lei, Jing; Roeder, Kathryn: Exponential-family embedding with application to cell developmental trajectories for single-cell RNA-seq data (2021)
  6. Meng, Zhiqing; Jiang, Min; Shen, Rui; Xu, Leiyan; Dang, Chuangyin: An objective penalty function method for biconvex programming (2021)
  7. Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
  8. Bossmann, Florian; Ma, Jianwei: Enhanced image approximation using shifted rank-1 reconstruction (2020)
  9. Chen, Yunxiao; Li, Xiaoou; Zhang, Siliang: Structured latent factor analysis for large-scale data: identifiability, estimability, and their implications (2020)
  10. Galuzzi, B. G.; Giordani, I.; Candelieri, A.; Perego, R.; Archetti, F.: Hyperparameter optimization for recommender systems through Bayesian optimization (2020)
  11. Hong, David; Kolda, Tamara G.; Duersch, Jed A.: Generalized canonical polyadic tensor decomposition (2020)
  12. Kallus, Nathan; Udell, Madeleine: Dynamic assortment personalization in high dimensions (2020)
  13. Kolda, Tamara G.; Hong, David: Stochastic gradients for large-scale tensor decomposition (2020)
  14. Landgraf, Andrew J.; Lee, Yoonkyung: Dimensionality reduction for binary data through the projection of natural parameters (2020)
  15. Li, Xinrong; Xiu, Naihua; Zhou, Shenglong: Matrix optimization over low-rank spectral sets: stationary points and local and global minimizers (2020)
  16. Lumbreras, Alberto; Filstroff, Louis; Févotte, Cédric: Bayesian mean-parameterized nonnegative binary matrix factorization (2020)
  17. Robin, Geneviève; Klopp, Olga; Josse, Julie; Moulines, Éric; Tibshirani, Robert: Main effects and interactions in mixed and incomplete data frames (2020)
  18. Shen, Rui; Meng, Zhiqing; Jiang, Min: Smoothing partially exact penalty function of biconvex programming (2020)
  19. Sportisse, Aude; Boyer, Claire; Josse, Julie: Imputation and low-rank estimation with missing not at random data (2020)
  20. Alaya, Mokhtar Z.; Klopp, Olga: Collective matrix completion (2019)
