Theano

Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Theano features tight integration with NumPy, transparent use of a GPU, efficient symbolic differentiation, speed and stability optimizations, dynamic C code generation, and extensive unit testing and self-verification. Theano has been powering large-scale, computationally intensive scientific investigations since 2007, but it is also approachable enough to be used in the classroom (course IFT6266 at the University of Montreal). (Source: http://freecode.com/)
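
A minimal sketch of the define/optimize/evaluate workflow described above, showing how a symbolic expression is built, differentiated, and compiled into an efficient callable (this assumes Theano 1.x with its standard theano.tensor API; the variable names are illustrative only):

    import theano
    import theano.tensor as T

    # Define a symbolic matrix and build an expression graph on it.
    x = T.dmatrix('x')
    cost = T.sum(x ** 2)

    # Efficient symbolic differentiation: gradient of the cost w.r.t. x.
    grad = T.grad(cost, x)

    # Compile the graph; optimization and dynamic C code generation happen
    # here, and a GPU is used transparently if one is configured.
    f = theano.function([x], [cost, grad])

    print(f([[1.0, 2.0], [3.0, 4.0]]))

Calling f evaluates both the cost and its gradient in a single pass over the compiled, optimized graph.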


References in zbMATH (referenced in 74 articles)

Showing results 1 to 20 of 74, sorted by year (citations).


  1. Alvaro Tejero-Cantero; Jan Boelts; Michael Deistler; Jan-Matthis Lueckmann; Conor Durkan; Pedro J. Gonçalves; David S. Greenberg; Jakob H. Macke: sbi: A toolkit for simulation-based inference (2020) not zbMATH
  2. Bloem-Reddy, Benjamin; Teh, Yee Whye: Probabilistic symmetries and invariant neural networks (2020)
  3. Cohen, William; Yang, Fan; Mazaitis, Kathryn Rivard: TensorLog: a probabilistic database implemented using deep-learning infrastructure (2020)
  4. Duarte, Victor; Duarte, Diogo; Fonseca, Julia; Montecinos, Alexis: Benchmarking machine-learning software and hardware for quantitative economics (2020)
  5. Guo, Jian; He, He; He, Tong; Lausen, Leonard; Li, Mu; Lin, Haibin; Shi, Xingjian; Wang, Chenguang; Xie, Junyuan; Zha, Sheng; Zhang, Aston; Zhang, Hang; Zhang, Zhi; Zhang, Zhongyue; Zheng, Shuai; Zhu, Yi: GluonCV and GluonNLP: deep learning in computer vision and natural language processing (2020)
  6. Hottung, André; Tanaka, Shunji; Tierney, Kevin: Deep learning assisted heuristic tree search for the container pre-marshalling problem (2020)
  7. Hughes, Mark C.: A neural network approach to predicting and computing knot invariants (2020)
  8. Joshua G. Albert: JAXNS: a high-performance nested sampling package based on JAX (2020) arXiv
  9. Katrutsa, Alexandr; Daulbaev, Talgat; Oseledets, Ivan: Black-box learning of multigrid parameters (2020)
  10. Reizenstein, Jeremy F.; Graham, Benjamin: Algorithm 1004: The iisignature library: efficient calculation of iterated-integral signatures and log signatures (2020)
  11. René, Alexandre; Longtin, André; Macke, Jakob H.: Inference of a mesoscopic population model from population spike trains (2020)
  12. Škrlj, Blaž; Kralj, Jan; Lavrač, Nada: Embedding-based silhouette community detection (2020)
  13. Sun, Luning; Gao, Han; Pan, Shaowu; Wang, Jian-Xun: Surrogate modeling for fluid flows based on physics-constrained deep learning without simulation data (2020)
  14. Tobias Stål, Anya M. Reading: A Grid for Multidimensional and Multivariate Spatial Representation and Data Processing (2020) not zbMATH
  15. Tomás Capretto, Camen Piho, Ravin Kumar, Jacob Westfall, Tal Yarkoni, Osvaldo A. Martin: Bambi: A simple interface for fitting Bayesian linear models in Python (2020) arXiv
  16. Willmott, Devin; Murrugarra, David; Ye, Qiang: Improving RNA secondary structure prediction via state inference with deep recurrent neural networks (2020)
  17. Arnaudon, Alexis; Holm, Darryl D.; Sommer, Stefan: A geometric framework for stochastic shape analysis (2019)
  18. Bonilla, Edwin V.; Krauth, Karl; Dezfouli, Amir: Generic inference in latent Gaussian process models (2019)
  19. Cox, Marco; van de Laar, Thijs; de Vries, Bert: A factor graph approach to automated design of Bayesian signal processing algorithms (2019)
  20. Edgar Riba, Dmytro Mishkin, Daniel Ponsa, Ethan Rublee, Gary Bradski: Kornia: an Open Source Differentiable Computer Vision Library for PyTorch (2019) arXiv
