word2vec

This tool provides an efficient implementation of the continuous bag-of-words (CBOW) and skip-gram architectures for computing vector representations of words. It takes a text corpus as input and produces word vectors as output: it first constructs a vocabulary from the training text data and then learns vector representations of the words. The resulting word vector file can be used as features in many natural language processing and machine learning applications and for further research.
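The original implementation is a command-line tool written in C; the same CBOW and skip-gram models are also available in independent reimplementations. As a minimal sketch (not the original tool itself), the following Python snippet uses the gensim library's Word2Vec class; the file name corpus.txt, the query word "king", and all hyperparameter values are illustrative assumptions:

    # Minimal sketch using gensim's independent reimplementation of word2vec.
    # Assumes gensim >= 4.0; "corpus.txt" is a hypothetical plain-text file
    # with one whitespace-tokenized sentence per line.
    from gensim.models import Word2Vec
    from gensim.models.word2vec import LineSentence

    sentences = LineSentence("corpus.txt")  # streams the corpus line by line

    model = Word2Vec(
        sentences,
        vector_size=200,  # dimensionality of the word vectors
        window=5,         # context window size
        min_count=5,      # discard words rarer than this
        sg=0,             # 0 = continuous bag-of-words (CBOW), 1 = skip-gram
        workers=4,        # number of training threads
    )

    # The learned vectors can then be used as features downstream.
    vector = model.wv["king"]                     # a 200-dimensional vector
    print(model.wv.most_similar("king", topn=5))  # nearest neighbours

    # Save the vectors in the standard word2vec text format for reuse.
    model.wv.save_word2vec_format("vectors.txt", binary=False)

Setting sg=1 selects the skip-gram architecture, which tends to represent infrequent words better, while CBOW is faster to train.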


References in zbMATH (referenced in 128 articles)

Showing results 1 to 20 of 128, sorted by year (citations).


  1. Paaßen, Benjamin; McBroom, Jessica; Jeffries, Bryn; Koprinska, Irena; Yacef, Kalina: ast2vec: utilizing recursive neural encodings of Python programs (2021) arXiv
  2. Zhu, Yuanyuan; Hu, Bin; Chen, Lei; Dai, Qi: iMPTCE-Hnetwork: a multilabel classifier for identifying metabolic pathway types of chemicals and enzymes with a heterogeneous network (2021)
  3. Agrawal, Devanshu; Papamarkou, Theodore; Hinkle, Jacob: Wide neural networks with bottlenecks are deep Gaussian processes (2020)
  4. Aryal, Sunil; Ting, Kai Ming; Washio, Takashi; Haffari, Gholamreza: A comparative study of data-dependent approaches without learning in measuring similarities of data objects (2020)
  5. Baechler, Gilles; Dümbgen, Frederike; Elhami, Golnoosh; Kreković, Miranda; Vetterli, Martin: Coordinate difference matrices (2020)
  6. Bassu, Devasis; Jones, Peter W.; Ness, Linda; Shallcross, David: Product formalisms for measures on spaces with binary tree structures: representation, visualization, and multiscale noise (2020)
  7. Chang, Haw-Shiuan; Vembu, Shankar; Mohan, Sunil; Uppaal, Rheeya; McCallum, Andrew: Using error decay prediction to overcome practical issues of deep active learning for named entity recognition (2020)
  8. Chen, Yiqi; Qian, Tieyun: Relation constrained attributed network embedding (2020)
  9. Derbanosov, R. Yu.; Irkhin, I. A.: Issues of stability and uniqueness of stochastic matrix factorization (2020)
  10. Xie, Fangzhou: Pruned Wasserstein index generation model and wigpy package (2020) arXiv
  11. Fürnkranz, Johannes; Kliegr, Tomáš; Paulheim, Heiko: On cognitive preferences and the plausibility of rule-based models (2020)
  12. Grigorieva, Elena Gennadievna; Klyachin, Vladimir Aleksandrovich: The study of the statistical characteristics of the text based on the graph model of the linguistic corpus (2020)
  13. Hoiles, William; Krishnamurthy, Vikram; Pattanayak, Kunal: Rationally inattentive inverse reinforcement learning explains YouTube commenting behavior (2020)
  14. Ito, Tomoki; Tsubouchi, Kota; Sakaji, Hiroki; Yamashita, Tatsuo; Izumi, Kiyoshi: Concept cloud-based sentiment visualization for financial reviews (2020)
  15. Jia, Chengfeng; Han, Hua; Lv, Ya’nan; Zhang, Lu: Link prediction algorithm based on Word2vec and particle swarm (2020)
  16. Jiang, Zilong; Gao, Shu; Chen, Liangchen: Study on text representation method based on deep learning and topic information (2020)
  17. Juda, Mateusz: Unsupervised features learning for sampled vector fields (2020)
  18. Kazemi, Seyed Mehran; Goel, Rishab; Jain, Kshitij; Kobyzev, Ivan; Sethi, Akshay; Forsyth, Peter; Poupart, Pascal: Representation learning for dynamic graphs: a survey (2020)
  19. Khandagale, Sujay; Xiao, Han; Babbar, Rohit: Bonsai: diverse and shallow trees for extreme multi-label classification (2020)
  20. Kreĭnes, M. G.; Kreĭnes, Elena M.: Matrix text models. Text corpora models (2020)
