Keras

Keras: Deep Learning library for Theano and TensorFlow. Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation; being able to go from idea to result with the least possible delay is key to doing good research. Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through total modularity, minimalism, and extensibility);
- supports both convolutional networks and recurrent networks, as well as combinations of the two;
- supports arbitrary connectivity schemes (including multi-input and multi-output training);
- runs seamlessly on CPU and GPU.
Read the documentation at Keras.io. Keras is compatible with Python 2.7-3.5.
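
The following is a minimal sketch (not part of the zbMATH entry) of the kind of rapid prototyping the description refers to: a small fully connected classifier built with the Sequential model API and trained on placeholder data. Layer sizes, the optimizer, and the random data are illustrative assumptions only.

    # Minimal Keras sketch: build, compile, and fit a small classifier.
    # Shapes, optimizer, and data are illustrative assumptions.
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    # Two-layer feed-forward network for a 10-class problem.
    model = Sequential()
    model.add(Dense(64, activation='relu', input_dim=100))
    model.add(Dense(10, activation='softmax'))

    # Configure the learning process in a single call.
    model.compile(optimizer='sgd',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    # Random placeholder data standing in for a real dataset.
    x_train = np.random.random((1000, 100))
    y_train = np.eye(10)[np.random.randint(0, 10, size=1000)]

    # Train; the same code runs on CPU or GPU without changes.
    model.fit(x_train, y_train, epochs=5, batch_size=32)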


References in zbMATH (referenced in 120 articles)

  1. Binois, Mickael; Picheny, Victor; Taillandier, Patrick; Habbal, Abderrahmane: The Kalai-Smorodinsky solution for many-objective Bayesian optimization (2020)
  2. Breger, A.; Orlando, J. I.; Harar, P.; Dörfler, M.; Klimscha, S.; Grechenig, C.; Gerendas, B. S.; Schmidt-Erfurth, U.; Ehler, M.: On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems (2020)
  3. Cheung, Siu Wun; Chung, Eric T.; Efendiev, Yalchin; Gildin, Eduardo; Wang, Yating; Zhang, Jingyan: Deep global model reduction learning in porous media flow simulation (2020)
  4. Chunggi Lee, Sanghoon Kim, Dongyun Han, Hongjun Yang, Young-Woo Park, Bum Chul Kwon, Sungahn Ko: GUIComp: A GUI Design Assistant with Real-Time, Multi-Faceted Feedback (2020) arXiv
  5. Fan, Yuwei; Ying, Lexing: Solving electrical impedance tomography with deep learning (2020)
  6. Heider, Yousef; Wang, Kun; Sun, WaiChing: SO(3)-invariance of informed-graph-based deep neural network for anisotropic elastoplastic materials (2020)
  7. Hottung, André; Tanaka, Shunji; Tierney, Kevin: Deep learning assisted heuristic tree search for the container pre-marshalling problem (2020)
  8. Hueber, Thomas; Tatulli, Eric; Girin, Laurent; Schwartz, Jean-Luc: Evaluating the potential gain of auditory and audiovisual speech-predictive coding using deep learning (2020)
  9. Hughes, Mark C.: A neural network approach to predicting and computing knot invariants (2020)
  10. Hu, Ruimeng: Deep learning for ranking response surfaces with applications to optimal stopping problems (2020)
  11. Kalina, Jan; Vidnerová, Petra: Regression neural networks with a highly robust loss function (2020)
  12. Karumuri, Sharmila; Tripathy, Rohit; Bilionis, Ilias; Panchal, Jitesh: Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks (2020)
  13. Kharrat, Tarak; McHale, Ian G.; Peña, Javier López: Plus-minus player ratings for soccer (2020)
  14. Korshunova, Nina; Jomo, J.; Lékó, G.; Reznik, D.; Balázs, P.; Kollmannsberger, S.: Image-based material characterization of complex microarchitectured additively manufactured structures (2020)
  15. Lejeune, Emma; Linder, Christian: Interpreting stochastic agent-based models of cell death (2020)
  16. Liberti, Leo: Distance geometry and data science (2020)
  17. Lukas Geiger; Plumerai Team: Larq: An Open-Source Library for Training Binarized Neural Networks (2020) not zbMATH
  18. Lye, Kjetil O.; Mishra, Siddhartha; Ray, Deep: Deep learning observables in computational fluid dynamics (2020)
  19. Meister, Felix; Passerini, Tiziano; Mihalef, Viorel; Tuysuzoglu, Ahmet; Maier, Andreas; Mansi, Tommaso: Deep learning acceleration of total Lagrangian explicit dynamics for soft tissue mechanics (2020)
  20. Osman, Yousuf Babiker M.; Li, Wei: Soft sensor modeling of key effluent parameters in wastewater treatment process based on SAE-NN (2020)