word2vec

This tool provides an efficient implementation of the continuous bag-of-words and skip-gram architectures for computing vector representations of words. These representations can subsequently be used in many natural language processing applications and for further research. The word2vec tool takes a text corpus as input and produces word vectors as output. It first constructs a vocabulary from the training text data and then learns vector representations of the words. The resulting word vector file can be used as features in many natural language processing and machine learning applications.
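As a minimal sketch of this workflow (corpus in, vocabulary built, vectors out), the example below uses the gensim library's Word2Vec re-implementation rather than the original C tool; the file name "corpus.txt" and all hyperparameter values are illustrative assumptions, and gensim ≥ 4.0 is assumed for the parameter names.

```python
# Sketch: train word vectors on a plain-text corpus with gensim's Word2Vec
# (an independent re-implementation, not the original C tool).
from gensim.models import Word2Vec
from gensim.models.word2vec import LineSentence

# Each line of corpus.txt is treated as one sentence of whitespace-separated tokens.
sentences = LineSentence("corpus.txt")  # hypothetical corpus file

model = Word2Vec(
    sentences,
    vector_size=200,   # dimensionality of the word vectors
    window=5,          # context window size
    min_count=5,       # ignore words occurring fewer than 5 times
    sg=1,              # 1 = skip-gram, 0 = continuous bag-of-words (CBOW)
    workers=4,         # training threads
)

# The learned vectors can then be used as features, e.g. to query similar words.
print(model.wv.most_similar("king", topn=5))

# Save the vectors in the standard word2vec text format for use elsewhere.
model.wv.save_word2vec_format("vectors.txt", binary=False)
```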


References in zbMATH (referenced in 94 articles)

Showing results 1 to 20 of 94, sorted by year (citations).


  1. Baechler, Gilles; Dümbgen, Frederike; Elhami, Golnoosh; Kreković, Miranda; Vetterli, Martin: Coordinate difference matrices (2020)
  2. Fangzhou Xie: Pruned Wasserstein Index Generation Model and wigpy Package (2020) arXiv
  3. Fürnkranz, Johannes; Kliegr, Tomáš; Paulheim, Heiko: On cognitive preferences and the plausibility of rule-based models (2020)
  4. Grigorieva, Elena Gennadievna; Klyachin, Vladimir Aleksandrovich: The study of the statistical characteristics of the text based on the graph model of the linguistic corpus (2020)
  5. Ito, Tomoki; Tsubouchi, Kota; Sakaji, Hiroki; Yamashita, Tatsuo; Izumi, Kiyoshi: Concept cloud-based sentiment visualization for financial reviews (2020)
  6. Kazemi, Seyed Mehran; Goel, Rishab; Jain, Kshitij; Kobyzev, Ivan; Sethi, Akshay; Forsyth, Peter; Poupart, Pascal: Representation learning for dynamic graphs: a survey (2020)
  7. Kreĭnes, M. G.; Kreĭnes, Elena M.: Matrix text models. Text corpora models (2020)
  8. Kreĭnes, M. G.; Kreĭnes, Elena M.: Matrix text models. Text models and similarity of text contents (2020)
  9. Lavrač, Nada; Škrlj, Blaž; Robnik-Šikonja, Marko: Propositionalization and embeddings: two sides of the same coin (2020)
  10. Lee, Gee Y.; Manski, Scott; Maiti, Tapabrata: Actuarial applications of word embedding models (2020)
  11. Lee, O-Joun; Jung, Jason J.: Story embedding: learning distributed representations of stories based on character networks (2020)
  12. Liberti, Leo: Distance geometry and data science (2020)
  13. Li, Dandan; Summers-Stay, Douglas: Dual embeddings and metrics for word and relational similarity (2020)
  14. Nguyen, Thi Thanh Sang; Do, Pham Minh Thu: Classification optimization for training a large dataset with naïve Bayes (2020)
  15. Pasini, Tommaso; Navigli, Roberto: Train-o-matic: supervised word sense disambiguation with no (manual) effort (2020)
  16. Pio, Gianvito; Ceci, Michelangelo; Prisciandaro, Francesca; Malerba, Donato: Exploiting causality in gene network reconstruction based on graph embedding (2020)
  17. Ruiz, Francisco J. R.; Athey, Susan; Blei, David M.: SHOPPER: a probabilistic model of consumer choice with substitutes and complements (2020)
  18. Samanta, Bidisha; De, Abir; Jana, Gourhari; Gómez, Vicenç; Chattaraj, Pratim; Ganguly, Niloy; Gomez-Rodriguez, Manuel: NeVAE: a deep generative model for molecular graphs (2020)
  19. Simpson, Edwin; Gurevych, Iryna: Scalable Bayesian preference learning for crowds (2020)
  20. Skelac, Ines; Jandrić, Andrej: Meaning as use: from Wittgenstein to Google’s Word2vec (2020)
