TensorFlow

TensorFlow™ is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture lets you deploy computation to one or more CPUs or GPUs on a desktop, server, or mobile device with a single API. TensorFlow was originally developed by researchers and engineers on the Google Brain Team within Google’s Machine Intelligence research organization for conducting machine learning and deep neural network research, but the system is general enough to be applicable in a wide variety of other domains as well.
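
As a minimal sketch of the data flow model described above (assuming TensorFlow 2.x installed as the `tensorflow` package; the variable and function names are illustrative only):

    import tensorflow as tf

    # Tensors (multidimensional arrays) flow along the graph edges;
    # operations such as matmul and add are the graph nodes.
    x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    w = tf.Variable(tf.ones([2, 1]))

    @tf.function  # traces the Python function into a data flow graph
    def model(inputs):
        return tf.matmul(inputs, w) + 1.0

    print(model(x).numpy())

The same graph can then be placed on a CPU, GPU, or mobile runtime without changing the Python code that defines it.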


References in zbMATH (referenced in 239 articles)

Showing results 1 to 20 of 239, sorted by year (citations).


  1. Alexander M. Rush: Torch-Struct: Deep Structured Prediction Library (2020) arXiv
  2. Ali Shahin Shamsabadi, Adria Gascon, Hamed Haddadi, Andrea Cavallaro: PrivEdge: From Local to Distributed Private Training and Prediction (2020) arXiv
  3. Anderson, Ross; Huchette, Joey; Ma, Will; Tjandraatmadja, Christian; Vielma, Juan Pablo: Strong mixed-integer programming formulations for trained neural networks (2020)
  4. Arridge, S.; Hauptmann, A.: Networks for nonlinear diffusion problems in imaging (2020)
  5. Arun S. Maiya: ktrain: A Low-Code Library for Augmented Machine Learning (2020) arXiv
  6. Banert, Sebastian; Ringh, Axel; Adler, Jonas; Karlsson, Johan; Öktem, Ozan: Data-driven nonsmooth optimization (2020)
  7. Benedek Rozemberczki, Oliver Kiss, Rik Sarkar: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs (2020) arXiv
  8. Biau, Gérard; Cadre, Benoît; Sangnier, Maxime; Tanielian, Ugo: Some theoretical properties of GANS (2020)
  9. Boehmke, Brad; Greenwell, Brandon M.: Hands-on machine learning with R (2020)
  10. Boso, Francesca; Tartakovsky, Daniel M.: Data-informed method of distributions for hyperbolic conservation laws (2020)
  11. Budninskiy, Max; Abdelaziz, Ameera; Tong, Yiying; Desbrun, Mathieu: Laplacian-optimized diffusion for semi-supervised learning (2020)
  12. Chaoyang He, Songze Li, Jinhyun So, Mi Zhang, Hongyi Wang, Xiaoyang Wang, Praneeth Vepakomma, Abhishek Singh, Hang Qiu, Li Shen, Peilin Zhao, Yan Kang, Yang Liu, Ramesh Raskar, Qiang Yang, Murali Annavaram, Salman Avestimehr: FedML: A Research Library and Benchmark for Federated Machine Learning (2020) arXiv
  13. Chris Cummins, Zacharias V. Fisches, Tal Ben-Nun, Torsten Hoefler, Hugh Leather: ProGraML: Graph-based Deep Learning for Program Optimization and Analysis (2020) arXiv
  14. Christoforou, Emmanouil; Emiris, Ioannis Z.; Florakis, Apostolos: Neural networks for cryptocurrency evaluation and price fluctuation forecasting (2020)
  15. Cocchi, G.; Liuzzi, G.; Lucidi, S.; Sciandrone, M.: On the convergence of steepest descent methods for multiobjective optimization (2020)
  16. Cohen, William; Yang, Fan; Mazaitis, Kathryn Rivard: TensorLog: a probabilistic database implemented using deep-learning infrastructure (2020)
  17. Cui, Ying; He, Ziyu; Pang, Jong-Shi: Multicomposite nonconvex optimization for training deep neural networks (2020)
  18. Davis, Damek; Drusvyatskiy, Dmitriy; Kakade, Sham; Lee, Jason D.: Stochastic subgradient method converges on tame functions (2020)
  19. Dillon Niederhut: niacin: A Python package for text data enrichment (2020) not zbMATH
  20. Dittmer, Sören; Kluth, Tobias; Maass, Peter; Otero Baguer, Daniel: Regularization by architecture: a deep prior approach for inverse problems (2020)
