word2vec

This tool provides an efficient implementation of the continuous bag-of-words and skip-gram architectures for computing vector representations of words. The word2vec tool takes a text corpus as input and produces word vectors as output: it first constructs a vocabulary from the training text data and then learns vector representations of the words. The resulting word vector file can be used as features in many natural language processing and machine learning applications, and as a basis for further research. A minimal usage sketch follows below.
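The following sketch illustrates this workflow (build a vocabulary, learn vectors, reuse them as features) using the gensim library's Word2Vec reimplementation of the same architectures; this is an assumption for illustration, since the original word2vec tool is a standalone C program driven by command-line flags. Parameter names such as vector_size and sg follow the gensim 4.x API.

```python
# A minimal sketch of training word vectors, assuming the gensim
# library's reimplementation of the CBOW and skip-gram architectures
# (the original word2vec tool is a standalone C program).
from gensim.models import Word2Vec

# Toy corpus: in practice this would be a large tokenized text corpus.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["words", "are", "mapped", "to", "dense", "vectors"],
]

# sg=1 selects skip-gram; sg=0 selects continuous bag-of-words (CBOW).
# The model first builds a vocabulary, then learns the vectors.
model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the word vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # skip-gram architecture
)

# The learned vectors can be queried, saved, and reused as features.
vector = model.wv["king"]                    # 100-dimensional numpy array
print(model.wv.most_similar("king", topn=3))
model.wv.save_word2vec_format("vectors.txt", binary=False)
```

The saved file has the standard word2vec text format (one word and its vector per line), so it can be consumed by any downstream application that reads word vectors.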


References in zbMATH (referenced in 88 articles)

Showing results 1 to 20 of 88, sorted by year (citations).


  1. Baechler, Gilles; Dümbgen, Frederike; Elhami, Golnoosh; Kreković, Miranda; Vetterli, Martin: Coordinate difference matrices (2020)
  2. Fangzhou Xie: Pruned Wasserstein Index Generation Model and wigpy Package (2020) arXiv
  3. Fürnkranz, Johannes; Kliegr, Tomáš; Paulheim, Heiko: On cognitive preferences and the plausibility of rule-based models (2020)
  4. Ito, Tomoki; Tsubouchi, Kota; Sakaji, Hiroki; Yamashita, Tatsuo; Izumi, Kiyoshi: Concept cloud-based sentiment visualization for financial reviews (2020)
  5. Kreĭnes, M. G.; Kreĭnes, Elena M.: Matrix text models. Text corpora models (2020)
  6. Kreĭnes, M. G.; Kreĭnes, Elena M.: Matrix text models. Text models and similarity of text contents (2020)
  7. Lee, Gee Y.; Manski, Scott; Maiti, Tapabrata: Actuarial applications of word embedding models (2020)
  8. Lee, O-Joun; Jung, Jason J.: Story embedding: learning distributed representations of stories based on character networks (2020)
  9. Liberti, Leo: Distance geometry and data science (2020)
  10. Li, Dandan; Summers-Stay, Douglas: Dual embeddings and metrics for word and relational similarity (2020)
  11. Nguyen, Thi Thanh Sang; Do, Pham Minh Thu: Classification optimization for training a large dataset with naïve Bayes (2020)
  12. Pasini, Tommaso; Navigli, Roberto: Train-o-matic: supervised word sense disambiguation with no (manual) effort (2020)
  13. Pio, Gianvito; Ceci, Michelangelo; Prisciandaro, Francesca; Malerba, Donato: Exploiting causality in gene network reconstruction based on graph embedding (2020)
  14. Ruiz, Francisco J. R.; Athey, Susan; Blei, David M.: SHOPPER: a probabilistic model of consumer choice with substitutes and complements (2020)
  15. Simpson, Edwin; Gurevych, Iryna: Scalable Bayesian preference learning for crowds (2020)
  16. Skelac, Ines; Jandrić, Andrej: Meaning as use: from Wittgenstein to Google’s word2vec (2020)
  17. van Engelen, Jesper E.; Hoos, Holger H.: A survey on semi-supervised learning (2020)
  18. Vanzo, Andrea; Croce, Danilo; Bastianelli, Emanuele; Basili, Roberto; Nardi, Daniele: Grounded language interpretation of robotic commands through structured learning (2020)
  19. Veale, Tony: Changing channels: divergent approaches to the creative streaming of texts (2020)
  20. Xie, Fangzhou: Wasserstein index generation model: automatic generation of time-series index with application to economic policy uncertainty (2020)
