LowRankModels
LowRankModels.jl is a Julia package for modeling and fitting generalized low rank models (GLRMs). GLRMs model a data array by a low rank matrix, and include many well known models in data analysis, such as principal components analysis (PCA), matrix completion, robust PCA, nonnegative matrix factorization, k-means, and many more. For more information on GLRMs, see our paper. There is a Python interface to this package, and a GLRM implementation in the H2O machine learning platform with interfaces in a variety of languages.

LowRankModels.jl makes it easy to mix and match loss functions and regularizers to construct a model suitable for a particular data set. In particular, it supports:

- using different loss functions for different columns of the data array, which is useful when data types are heterogeneous (e.g., real, boolean, and ordinal columns);
- fitting the model to only some of the entries in the table, which is useful for data tables with many missing (unobserved) entries; and
- adding offsets and scalings to the model without destroying sparsity, which is useful when the data is poorly scaled.
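As a rough illustration of mixing losses per column, the following is a minimal sketch of how a GLRM might be set up with LowRankModels.jl. The toy data matrix, the regularization weights, and the choice of losses are illustrative assumptions, not taken from the package's documentation; consult the package README for the authoritative API.

```julia
using LowRankModels

# Toy data: a 100×6 matrix whose last two columns are ±1 "boolean" data.
# (Illustrative only; in practice A would be your heterogeneous data table.)
m, n, k = 100, 6, 2
A = randn(m, n)
A[:, 5:6] = sign.(A[:, 5:6])

# One loss per column: quadratic loss for the real-valued columns,
# hinge loss for the boolean ones.
losses = [fill(QuadLoss(), 4); fill(HingeLoss(), 2)]

# Quadratic regularization on both factors X and Y
# (the 0.1 weight is an arbitrary choice for this sketch).
rx, ry = QuadReg(0.1), QuadReg(0.1)

# Build and fit the rank-k model by alternating minimization.
glrm = GLRM(A, losses, rx, ry, k)
X, Y, ch = fit!(glrm)

# The model's low rank reconstruction of A.
Ahat = impute(glrm)
```

To fit only a subset of the entries, the constructor also accepts a list of observed index pairs (an `obs` keyword of `(i, j)` tuples), which is how missing-data problems such as matrix completion are expressed in this framework.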
References in zbMATH (referenced in 19 articles, 1 standard article; sorted by year)
- Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
- Bossmann, Florian; Ma, Jianwei: Enhanced image approximation using shifted rank-1 reconstruction (2020)
- Hong, David; Kolda, Tamara G.; Duersch, Jed A.: Generalized canonical polyadic tensor decomposition (2020)
- Li, Xinrong; Xiu, Naihua; Zhou, Shenglong: Matrix optimization over low-rank spectral sets: stationary points and local and global minimizers (2020)
- Alaya, Mokhtar Z.; Klopp, Olga: Collective matrix completion (2019)
- Bai, Jushan; Ng, Serena: Rank regularized estimation of approximate factor models (2019)
- Balcan, Maria-Florina; Liang, Yingyu; Song, Zhao; Woodruff, David P.; Zhang, Hongyang: Non-convex matrix completion and related problems via strong duality (2019)
- Daneshmand, Amir; Sun, Ying; Scutari, Gesualdo; Facchinei, Francisco; Sadler, Brian M.: Decentralized dictionary learning over time-varying digraphs (2019)
- Driggs, Derek; Becker, Stephen; Aravkin, Aleksandr: Adapting regularized low-rank models for parallel architectures (2019)
- Gillis, Nicolas; Shitov, Yaroslav: Low-rank matrix approximation in the infinity norm (2019)
- Fithian, William; Mazumder, Rahul: Flexible low-rank statistical modeling with missing data and side information (2018)
- Liu, Lydia T.; Dobriban, Edgar; Singer, Amit: (e)PCA: high dimensional exponential family PCA (2018)
- Luo, Chongliang; Liang, Jian; Li, Gen; Wang, Fei; Zhang, Changshui; Dey, Dipak K.; Chen, Kun: Leveraging mixed and incomplete outcomes via reduced-rank modeling (2018)
- Yang, Lei; Pong, Ting Kei; Chen, Xiaojun: A nonmonotone alternating updating method for a class of matrix factorization problems (2018)
- Bigot, Jérémie; Deledalle, Charles; Féral, Delphine: Generalized SURE for optimal shrinkage of singular values in low-rank matrix denoising (2017)
- Dutta, Aritra; Li, Xin: On a problem of weighted low-rank approximation of matrices (2017)
- Fithian, William; Josse, Julie: Multiple correspondence analysis and the multilogit bilinear model (2017)
- Josse, Julie; Wager, Stefan: Bootstrap-based regularization for low-rank matrix estimation (2016)
- Udell, Madeleine; Horn, Corinne; Zadeh, Reza; Boyd, Stephen: Generalized low rank models (2016)