This tool provides an efficient implementation of the continuous bag-of-words and skip-gram architectures for computing vector representations of words. These representations can subsequently be used in many natural language processing applications and for further research. The word2vec tool takes a text corpus as input and produces word vectors as output: it first constructs a vocabulary from the training text data and then learns vector representations of words. The resulting word vector file can be used as a feature source in many natural language processing and machine learning applications.
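To make the skip-gram idea concrete, here is a minimal numpy sketch of skip-gram training with negative sampling (one of the tool's training modes). The toy corpus, dimensionality, and hyperparameters are illustrative assumptions, not the tool's defaults, and the loop omits the subsampling and learning-rate decay the real implementation uses.

```python
import numpy as np

# Toy corpus; real applications train on large text corpora.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
ids = [w2i[w] for w in corpus]

V, D, window, lr, epochs, neg = len(vocab), 16, 2, 0.05, 200, 3
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for pos, center in enumerate(ids):
        lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            # One positive (center, context) pair plus `neg` random
            # negative samples drawn uniformly (the real tool uses a
            # smoothed unigram distribution).
            samples = [(ids[ctx_pos], 1.0)]
            samples += [(int(rng.integers(V)), 0.0) for _ in range(neg)]
            for out, label in samples:
                score = sigmoid(W_in[center] @ W_out[out])
                grad = score - label            # d(loss)/d(score input)
                g_in = grad * W_out[out]        # save before updating W_out
                W_out[out] -= lr * grad * W_in[center]
                W_in[center] -= lr * g_in

# After training, each row of W_in is the learned vector for one word.
print(W_in[w2i["cat"]].shape)  # (16,)
```

Keeping separate input and output matrices mirrors the original model: `W_in` holds the word vectors that are written to the output file, while `W_out` holds the context vectors used only during training.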

References in zbMATH (referenced in 128 articles)

Showing results 1 to 20 of 128.
Sorted by year (citations)


  1. Abheesht Sharma, Gunjan Chhablani, Harshit Pandey, Rajaswa Patil: DRIFT: A Toolkit for Diachronic Analysis of Scientific Literature (2021) arXiv
  2. Benjamin Paaßen, Jessica McBroom, Bryn Jeffries, Irena Koprinska, Kalina Yacef: ast2vec: Utilizing Recursive Neural Encodings of Python Programs (2021) arXiv
  3. Justin Shenk, Wolf Byttner, Saranraj Nambusubramaniyan, Alexander Zoeller: Traja: A Python toolbox for animal trajectory analysis (2021) not zbMATH
  4. Zhu, Yuanyuan; Hu, Bin; Chen, Lei; Dai, Qi: iMPTCE-Hnetwork: a multilabel classifier for identifying metabolic pathway types of chemicals and enzymes with a heterogeneous network (2021)
  5. Agrawal, Devanshu; Papamarkou, Theodore; Hinkle, Jacob: Wide neural networks with bottlenecks are deep Gaussian processes (2020)
  6. Aryal, Sunil; Ting, Kai Ming; Washio, Takashi; Haffari, Gholamreza: A comparative study of data-dependent approaches without learning in measuring similarities of data objects (2020)
  7. Baechler, Gilles; Dümbgen, Frederike; Elhami, Golnoosh; Kreković, Miranda; Vetterli, Martin: Coordinate difference matrices (2020)
  8. Bassu, Devasis; Jones, Peter W.; Ness, Linda; Shallcross, David: Product formalisms for measures on spaces with binary tree structures: representation, visualization, and multiscale noise (2020)
  9. Chang, Haw-Shiuan; Vembu, Shankar; Mohan, Sunil; Uppaal, Rheeya; McCallum, Andrew: Using error decay prediction to overcome practical issues of deep active learning for named entity recognition (2020)
  10. Chen, Yiqi; Qian, Tieyun: Relation constrained attributed network embedding (2020)
  11. Derbanosov, R. Yu.; Irkhin, I. A.: Issues of stability and uniqueness of stochastic matrix factorization (2020)
  12. Fangzhou Xie: Pruned Wasserstein Index Generation Model and wigpy Package (2020) arXiv
  13. Fürnkranz, Johannes; Kliegr, Tomáš; Paulheim, Heiko: On cognitive preferences and the plausibility of rule-based models (2020)
  14. Grigorieva, Elena Gennadievna; Klyachin, Vladimir Aleksandrovich: The study of the statistical characteristics of the text based on the graph model of the linguistic corpus (2020)
  15. Hoiles, William; Krishnamurthy, Vikram; Pattanayak, Kunal: Rationally inattentive inverse reinforcement learning explains YouTube commenting behavior (2020)
  16. Interdonato, Roberto; Magnani, Matteo; Perna, Diego; Tagarelli, Andrea; Vega, Davide: Multilayer network simplification: approaches, models and methods (2020)
  17. Ito, Tomoki; Tsubouchi, Kota; Sakaji, Hiroki; Yamashita, Tatsuo; Izumi, Kiyoshi: Concept cloud-based sentiment visualization for financial reviews (2020)
  18. Jia, Chengfeng; Han, Hua; Lv, Ya’nan; Zhang, Lu: Link prediction algorithm based on Word2vec and particle swarm (2020)
  19. Jiang, Zilong; Gao, Shu; Chen, Liangchen: Study on text representation method based on deep learning and topic information (2020)
  20. Juda, Mateusz: Unsupervised features learning for sampled vector fields (2020)