Tensor2Tensor

Tensor2Tensor, or T2T for short, is a library of deep learning models and datasets designed to make deep learning more accessible and to accelerate ML research. T2T is actively used and maintained by researchers and engineers within the Google Brain team and by a community of users. We're eager to collaborate with you too, so feel free to open an issue on GitHub or send along a pull request (see our contribution doc). You can chat with us on Gitter and join the T2T Google Group.
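
As a concrete illustration, here is a minimal sketch of how the library is typically driven from Python. It assumes the tensor2tensor pip package is installed; the problem name below is one example of a registered dataset, and the set of registered names can vary across T2T versions.

```python
# Minimal T2T usage sketch (assumes the `tensor2tensor` pip package;
# the problem name is illustrative and may differ between versions).
from tensor2tensor import problems

# List every dataset/task registered with the library.
print(problems.available())

# Look up one registered problem, e.g. English-German translation.
ende = problems.problem("translate_ende_wmt32k")

# Download and preprocess its data into the given directories
# (commented out: this fetches several GB and takes a while).
# ende.generate_data("/tmp/t2t_data", "/tmp/t2t_tmp")
```

Training itself is usually launched from the command line with the t2t-trainer tool, which ties a registered problem to a model (e.g. transformer) and a hyperparameter set via flags.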


References in zbMATH (referenced in 91 articles)

Showing results 61 to 80 of 91, sorted by year.
  61. Yin, Yongjing; Lai, Shaopeng; Song, Linfeng; Zhou, Chulun; Han, Xianpei; Yao, Junfeng; Su, Jinsong: An external knowledge enhanced graph-based neural network for sentence ordering (2021)
  62. Zhang, Meishan; Li, Zhenghua; Fu, Guohong; Zhang, Min: Dependency-based syntax-aware word representations (2021)
  63. Zhou, Yirong; Li, Jun; Chen, Hao; Wu, Ye; Wu, Jiangjiang; Chen, Luo: A spatiotemporal hierarchical attention mechanism-based model for multi-step station-level crowd flow prediction (2021)
  64. Arik, Sercan O.; Pfister, Tomas: ProtoAttend: attention-based prototypical learning (2020)
  65. Bacciu, Davide; Errica, Federico; Micheli, Alessio; Podda, Marco: A gentle introduction to deep learning for graphs (2020)
  66. Bloem-Reddy, Benjamin; Teh, Yee Whye: Probabilistic symmetries and invariant neural networks (2020)
  67. Fang, Jie; Lin, Jianwu; Xia, Shutao; Xia, Zhikang; Hu, Shenglei; Liu, Xiang; Jiang, Yong: Neural network-based automatic factor construction (2020)
  68. Feliu-Fabà, Jordi; Fan, Yuwei; Ying, Lexing: Meta-learning pseudo-differential operators with deep neural networks (2020)
  69. Fernandes, Bruno; Silva, Fabio; Alaiz-Moreton, Hector; Novais, Paulo; Neves, Jose; Analide, Cesar: Long short-term memory networks for traffic flow forecasting: exploring input variables, time frames and multi-step approaches (2020)
  70. Frady, E. Paxon; Kent, Spencer J.; Olshausen, Bruno A.; Sommer, Friedrich T.: Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures (2020)
  71. Geneva, Nicholas; Zabaras, Nicholas: Modeling the dynamics of PDE systems with physics-constrained deep auto-regressive networks (2020)
  72. Kazemi, Seyed Mehran; Goel, Rishab; Jain, Kshitij; Kobyzev, Ivan; Sethi, Akshay; Forsyth, Peter; Poupart, Pascal: Representation learning for dynamic graphs: a survey (2020)
  73. Lang, Xufeng; Sun, Zhengxing: Structure-aware shape correspondence network for 3D shape synthesis (2020)
  74. Liu, Li; Ouyang, Wanli; Wang, Xiaogang; Fieguth, Paul; Chen, Jie; Liu, Xinwang; Pietikäinen, Matti: Deep learning for generic object detection: a survey (2020)
  75. Sangiorgio, Matteo; Dercole, Fabio: Robustness of LSTM neural networks for multi-step forecasting of chaotic time series (2020)
  76. Sun, Ruo-Yu: Optimization for deep learning: an overview (2020)
  77. Tikhomirov, M. M.; Loukachevitch, N. V.; Dobrov, B. V.: Recognizing named entities in specific domain (2020)
  78. Wang, Chen; Xu, Li-yan; Fan, Jian-sheng: A general deep learning framework for history-dependent response prediction based on UA-Seq2Seq model (2020)
  79. Wang, Shirui; Zhou, Wenan; Jiang, Chao: A survey of word embeddings based on deep learning (2020)
  80. Wan, Qian; Liu, Jie; Wei, Luona; Ji, Bin: A self-attention based neural architecture for Chinese medical named entity recognition (2020)