LIBLINEAR is an open-source library for large-scale linear classification. It supports logistic regression and linear support vector machines. We provide easy-to-use command-line tools and library calls for users and developers. Comprehensive documentation is available for both beginners and advanced users. Experiments demonstrate that LIBLINEAR is very efficient on large sparse data sets.
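Beyond LIBLINEAR's own command-line tools and C API, a common way to call it is through scikit-learn, whose `LinearSVC` and `LogisticRegression(solver="liblinear")` are backed by LIBLINEAR. A minimal sketch (the synthetic data and parameter choices here are illustrative assumptions, not from the source):

```python
# Sketch: training LIBLINEAR-backed linear classifiers via scikit-learn.
# Assumption: scikit-learn is installed; its LinearSVC and
# LogisticRegression(solver="liblinear") delegate to LIBLINEAR internally.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

# Small synthetic binary classification problem (illustrative only)
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# L2-regularized linear support vector machine
svm = LinearSVC(C=1.0).fit(X, y)

# L2-regularized logistic regression using the LIBLINEAR backend
logreg = LogisticRegression(solver="liblinear", C=1.0).fit(X, y)

print(svm.score(X, y), logreg.score(X, y))
```

Both models expose the learned linear weights as `coef_`, so they can be inspected or exported after training.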

References in zbMATH (referenced in 127 articles, 1 standard article)

Showing results 1 to 20 of 127.
Sorted by year (citations)


  1. Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
  2. Pensar, Johan; Xu, Yingying; Puranen, Santeri; Pesonen, Maiju; Kabashima, Yoshiyuki; Corander, Jukka: High-dimensional structure learning of binary pairwise Markov networks: a comparative numerical study (2020)
  3. Po-Hsien Huang: lslx: Semi-Confirmatory Structural Equation Modeling via Penalized Likelihood (2020) not zbMATH
  4. Vanzo, Andrea; Croce, Danilo; Bastianelli, Emanuele; Basili, Roberto; Nardi, Daniele: Grounded language interpretation of robotic commands through structured learning (2020)
  5. Wu, Guoqiang; Zheng, Ruobing; Tian, Yingjie; Liu, Dalian: Joint ranking SVM and binary relevance with robust low-rank learning for multi-label classification (2020)
  6. Amir M. Mir; Jalal A. Nasiri: LightTwinSVM: A Simple and Fast Implementation of Standard Twin Support Vector Machine Classifier (2019) not zbMATH
  7. Babbar, Rohit; Schölkopf, Bernhard: Data scarcity, robustness and extreme multi-label classification (2019)
  8. Chvalovský, Karel; Jakubův, Jan; Suda, Martin; Urban, Josef: ENIGMA-NG: efficient neural and gradient-boosted inference guidance for (\mathrm{E}) (2019)
  9. Hong, Bin; Zhang, Weizhong; Liu, Wei; Ye, Jieping; Cai, Deng; He, Xiaofei; Wang, Jie: Scaling up sparse support vector machines by simultaneous feature and sample reduction (2019)
  10. Liu, Jiapeng; Liao, Xiuwu; Kadziński, Miłosz; Słowiński, Roman: Preference disaggregation within the regularization framework for sorting problems with multiple potentially non-monotonic criteria (2019)
  11. Možina, Martin; Demšar, Janez; Bratko, Ivan; Žabkar, Jure: Extreme value correction: a method for correcting optimistic estimations in rule learning (2019)
  12. Sadrfaridpour, Ehsan; Razzaghi, Talayeh; Safro, Ilya: Engineering fast multilevel support vector machines (2019)
  13. Song, Yangqiu; Upadhyay, Shyam; Peng, Haoruo; Mayhew, Stephen; Roth, Dan: Toward any-language zero-shot topic classification of textual documents (2019)
  14. Vinod Kumar Chauhan, Anuj Sharma, Kalpana Dahiya: LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms (2019) arXiv
  15. Wang, Po-Wei; Lee, Ching-Pei; Lin, Chih-Jen: The common-directions method for regularized empirical risk minimization (2019)
  16. Yin, Juan; Li, Qingna: A semismooth Newton method for support vector classification and regression (2019)
  17. Yue, Man-Chung; Zhou, Zirui; So, Anthony Man-Cho: A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property (2019)
  18. Zablith, Fouad; Osman, Ibrahim H.: ReviewModus: text classification and sentiment prediction of unstructured reviews using a hybrid combination of machine learning and evaluation models (2019)
  19. Zhou, Joey Tianyi; Pan, Sinno Jialin; Tsang, Ivor W.: A deep learning framework for hybrid heterogeneous transfer learning (2019)
  20. Zhou, Joey Tianyi; Tsang, Ivor W.; Pan, Sinno Jialin; Tan, Mingkui: Multi-class heterogeneous domain adaptation (2019)
