Tensor2Tensor

Tensor2Tensor, or T2T for short, is a library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. T2T is actively used and maintained by researchers and engineers within the Google Brain team and a community of users. We’re eager to collaborate with you too, so feel free to open an issue on GitHub or send along a pull request (see our contribution doc). You can chat with us on Gitter and join the T2T Google Group.
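To give a flavor of how the library is used, here is a minimal sketch of the T2T Python API, based on its quick-start documentation (assuming tensor2tensor has been installed via pip); the problem name below is one of the library's registered datasets, shown purely as an example.

    # Minimal sketch of the T2T Python API (assumes `pip install tensor2tensor`).
    from tensor2tensor import problems

    # Datasets in T2T are registered as named "problems";
    # problems.available() lists every registered name.
    print(problems.available())

    # Look up a registered problem by name, e.g. English-German
    # translation with a 32k subword vocabulary (example name).
    ende = problems.problem("translate_ende_wmt32k")
    print(ende.name)

Training is typically driven from the command line with the t2t-trainer binary, which ties a problem, a model, and a hyperparameter set together (e.g. --problem=translate_ende_wmt32k --model=transformer --hparams_set=transformer_base).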


References in zbMATH (referenced in 13 articles)


  1. Bloem-Reddy, Benjamin; Teh, Yee Whye: Probabilistic symmetries and invariant neural networks (2020)
  2. Frady, E. Paxon; Kent, Spencer J.; Olshausen, Bruno A.; Sommer, Friedrich T.: Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures (2020)
  3. Kazemi, Seyed Mehran; Goel, Rishab; Jain, Kshitij; Kobyzev, Ivan; Sethi, Akshay; Forsyth, Peter; Poupart, Pascal: Representation learning for dynamic graphs: a survey (2020)
  4. Lang, Xufeng; Sun, Zhengxing: Structure-aware shape correspondence network for 3D shape synthesis (2020)
  5. Tikhomirov, M. M.; Loukachevitch, N. V.; Dobrov, B. V.: Recognizing named entities in specific domain (2020)
  6. Ye, Han-Jia; Sheng, Xiang-Rong; Zhan, De-Chuan: Few-shot learning with adaptively initialized task optimizer: a practical meta-learning approach (2020)
  7. Chen, Shun; Ge, Lei: Exploring the attention mechanism in LSTM-based Hong Kong stock price movement prediction (2019)
  8. Su, Jinsong; Zhang, Xiangwen; Lin, Qian; Qin, Yue; Yao, Junfeng; Liu, Yang: Exploiting reverse target-side contexts for neural machine translation via asynchronous bidirectional decoding (2019)
  9. Wolf, Thomas; Debut, Lysandre; Sanh, Victor; Chaumond, Julien; Delangue, Clement; Moi, Anthony; Cistac, Pierric; Rault, Tim; Louf, Rémi; Funtowicz, Morgan; Brew, Jamie: HuggingFace’s Transformers: State-of-the-art Natural Language Processing (2019) arXiv
  10. Zeyer, Albert; Alkhouli, Tamer; Ney, Hermann: RETURNN as a Generic Flexible Neural Toolkit with Application to Translation and Speech Recognition (2018) arXiv
  11. Kuchaiev, Oleksii; Ginsburg, Boris; Gitman, Igor; Lavrukhin, Vitaly; Case, Carl; Micikevicius, Paulius: OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models (2018) arXiv
  12. Wang, Xiaolin; Utiyama, Masao; Sumita, Eiichiro: CytonMT: an Efficient Neural Machine Translation Open-source Toolkit Implemented in C++ (2018) arXiv
  13. Hu, Zhiting; Shi, Haoran; Yang, Zichao; Tan, Bowen; Zhao, Tiancheng; He, Junxian; Wang, Wentao; Yu, Xingjiang; Qin, Lianhui; Wang, Di; Ma, Xuezhe; Liu, Hector; Liang, Xiaodan; Zhu, Wanrong; Sachan, Devendra Singh; Xing, Eric P.: Texar: A Modularized, Versatile, and Extensible Toolkit for Text Generation (2018) arXiv