LIBLINEAR

LIBLINEAR is an open-source library for large-scale linear classification. It supports logistic regression and linear support vector machines. We provide easy-to-use command-line tools and library calls for users and developers. Comprehensive documentation is available for both beginners and advanced users. Experiments demonstrate that LIBLINEAR is very efficient on large sparse data sets.
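LIBLINEAR's command-line tools read training data in the sparse LIBSVM text format (a label followed by 1-based `index:value` pairs, with zero-valued features omitted). As a minimal sketch, the Python snippet below writes a tiny data set in that format; the file names in the comments are illustrative, and the `-s 0` solver flag (L2-regularized logistic regression) should be checked against the README of your installed LIBLINEAR version.

```python
# Write a tiny data set in the sparse "label index:value ..." text format
# that LIBLINEAR's `train` tool reads (the LIBSVM format).
# Feature indices are 1-based; features with value zero are simply omitted.

def to_libsvm_line(label, features):
    """Render one example: `label` is e.g. +1/-1, `features` maps
    1-based feature index -> value."""
    pairs = " ".join(f"{i}:{v}" for i, v in sorted(features.items()))
    return f"{label} {pairs}"

examples = [
    (+1, {1: 0.5, 3: 1.2}),   # positive example with features 1 and 3
    (-1, {2: 0.7}),           # negative example with feature 2 only
]

for y, x in examples:
    print(to_libsvm_line(y, x))

# Typical command-line use (paths are illustrative, not from this entry):
#   ./train -s 0 train.txt model      # -s 0 = L2-regularized logistic regression
#   ./predict test.txt model out.txt
```

The same text format is shared with LIBSVM, so data sets prepared this way can be used with either library.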


References in zbMATH (referenced in 132 articles, 1 standard article)

Showing results 1 to 20 of 132.
Sorted by year (citations)
  1. Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
  2. Khandagale, Sujay; Xiao, Han; Babbar, Rohit: Bonsai: diverse and shallow trees for extreme multi-label classification (2020)
  3. Pensar, Johan; Xu, Yingying; Puranen, Santeri; Pesonen, Maiju; Kabashima, Yoshiyuki; Corander, Jukka: High-dimensional structure learning of binary pairwise Markov networks: a comparative numerical study (2020)
  4. Po-Hsien Huang: lslx: Semi-Confirmatory Structural Equation Modeling via Penalized Likelihood (2020) not zbMATH
  5. Vanzo, Andrea; Croce, Danilo; Bastianelli, Emanuele; Basili, Roberto; Nardi, Daniele: Grounded language interpretation of robotic commands through structured learning (2020)
  6. Wang, Li; Zhang, Lei-hong; Bai, Zhaojun; Li, Ren-Cang: Orthogonal canonical correlation analysis and applications (2020)
  7. Wu, Guoqiang; Zheng, Ruobing; Tian, Yingjie; Liu, Dalian: Joint ranking SVM and binary relevance with robust low-rank learning for multi-label classification (2020)
  8. Yan, Yinqiao; Li, Qingna: An efficient augmented Lagrangian method for support vector machine (2020)
  9. Amir M. Mir; Jalal A. Nasiri: LightTwinSVM: A Simple and Fast Implementation of Standard Twin Support Vector Machine Classifier (2019) not zbMATH
  10. Babbar, Rohit; Schölkopf, Bernhard: Data scarcity, robustness and extreme multi-label classification (2019)
  11. Chvalovský, Karel; Jakubův, Jan; Suda, Martin; Urban, Josef: ENIGMA-NG: efficient neural and gradient-boosted inference guidance for E (2019)
  12. Gorban, Alexander N.; Burton, Richard; Romanenko, Ilya; Tyukin, Ivan Yu.: One-trial correction of legacy AI systems and stochastic separation theorems (2019)
  13. Hong, Bin; Zhang, Weizhong; Liu, Wei; Ye, Jieping; Cai, Deng; He, Xiaofei; Wang, Jie: Scaling up sparse support vector machines by simultaneous feature and sample reduction (2019)
  14. Liu, Jiapeng; Liao, Xiuwu; Kadziński, Miłosz; Słowiński, Roman: Preference disaggregation within the regularization framework for sorting problems with multiple potentially non-monotonic criteria (2019)
  15. Možina, Martin; Demšar, Janez; Bratko, Ivan; Žabkar, Jure: Extreme value correction: a method for correcting optimistic estimations in rule learning (2019)
  16. Song, Yangqiu; Upadhyay, Shyam; Peng, Haoruo; Mayhew, Stephen; Roth, Dan: Toward any-language zero-shot topic classification of textual documents (2019)
  17. Vinod Kumar Chauhan, Anuj Sharma, Kalpana Dahiya: LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms (2019) arXiv
  18. Wang, Po-Wei; Lee, Ching-Pei; Lin, Chih-Jen: The common-directions method for regularized empirical risk minimization (2019)
  19. Yin, Juan; Li, Qingna: A semismooth Newton method for support vector classification and regression (2019)
  20. Yue, Man-Chung; Zhou, Zirui; So, Anthony Man-Cho: A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property (2019)