Keras

Keras: deep learning library for Theano and TensorFlow. Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation: being able to go from idea to result with the least possible delay is key to doing good research.

Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through total modularity, minimalism, and extensibility);
- supports both convolutional networks and recurrent networks, as well as combinations of the two;
- supports arbitrary connectivity schemes (including multi-input and multi-output training);
- runs seamlessly on CPU and GPU.

Read the documentation at Keras.io. Keras is compatible with Python 2.7-3.5.
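As a concrete illustration of the fast prototyping the description emphasizes, the following is a minimal sketch of Keras's Sequential API, assuming a working Keras installation with either the TensorFlow or Theano backend (layer sizes and the 784-dimensional input are illustrative choices, not part of the entry above):

```python
# Minimal sketch: stacking layers with the Keras Sequential API.
from keras.models import Sequential
from keras.layers import Dense, Activation

# A small fully connected classifier: 784-dim inputs, 10 output classes.
model = Sequential()
model.add(Dense(64, input_dim=784))
model.add(Activation('relu'))
model.add(Dense(10))
model.add(Activation('softmax'))

# Compile with a standard optimizer/loss pair before calling model.fit(...).
model.compile(optimizer='sgd',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

The same model could then be trained with `model.fit(x_train, y_train)`; swapping in convolutional or recurrent layers follows the same add-and-compile pattern.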


References in zbMATH (referenced in 72 articles)

Showing results 1 to 20 of 72.
Sorted by year (citations)


  1. Ali Shahin Shamsabadi, Adria Gascon, Hamed Haddadi, Andrea Cavallaro: PrivEdge: From Local to Distributed Private Training and Prediction (2020) arXiv
  2. Breger, A.; Orlando, J. I.; Harar, P.; Dörfler, M.; Klimscha, S.; Grechenig, C.; Gerendas, B. S.; Schmidt-Erfurth, U.; Ehler, M.: On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems (2020)
  3. Cheung, Siu Wun; Chung, Eric T.; Efendiev, Yalchin; Gildin, Eduardo; Wang, Yating; Zhang, Jingyan: Deep global model reduction learning in porous media flow simulation (2020)
  4. Heider, Yousef; Wang, Kun; Sun, WaiChing: SO(3)-invariance of informed-graph-based deep neural network for anisotropic elastoplastic materials (2020)
  5. Hottung, André; Tanaka, Shunji; Tierney, Kevin: Deep learning assisted heuristic tree search for the container pre-marshalling problem (2020)
  6. Hughes, Mark C.: A neural network approach to predicting and computing knot invariants (2020)
  7. Kharrat, Tarak; McHale, Ian G.; Peña, Javier López: Plus-minus player ratings for soccer (2020)
  8. Lejeune, Emma; Linder, Christian: Interpreting stochastic agent-based models of cell death (2020)
  9. Liberti, Leo: Distance geometry and data science (2020)
  10. Liu, Peng; Song, Yan: Segmentation of sonar imagery using convolutional neural networks and Markov random field (2020)
  11. Lukas Geiger; Plumerai Team: Larq: An Open-Source Library for Training Binarized Neural Networks (2020) not zbMATH
  12. Meister, Felix; Passerini, Tiziano; Mihalef, Viorel; Tuysuzoglu, Ahmet; Maier, Andreas; Mansi, Tommaso: Deep learning acceleration of total Lagrangian explicit dynamics for soft tissue mechanics (2020)
  13. Palagi, Laura; Seccia, Ruggiero: Block layer decomposition schemes for training deep neural networks (2020)
  14. Parish, Eric J.; Carlberg, Kevin T.: Time-series machine-learning error models for approximate solutions to parameterized dynamical systems (2020)
  15. P. E. Hadjidoukas, A. Bartezzaghi, F. Scheidegger, R. Istrate, C. Bekas, A. C. I. Malossi: torcpy: Supporting task parallelism in Python (2020) not zbMATH
  16. Ruehle, Fabian: Data science applications to string theory (2020)
  17. Willmott, Devin; Murrugarra, David; Ye, Qiang: Improving RNA secondary structure prediction via state inference with deep recurrent neural networks (2020)
  18. Ariafar, Setareh; Coll-Font, Jaume; Brooks, Dana; Dy, Jennifer: ADMMBO: Bayesian optimization with unknown constraints using ADMM (2019)
  19. Arridge, Simon; Maass, Peter; Öktem, Ozan; Schönlieb, Carola-Bibiane: Solving inverse problems using data-driven models (2019)
  20. Balakrishnan, Harikrishnan Nellippallil; Kathpalia, Aditi; Saha, Snehanshu; Nagaraj, Nithin: Chaosnet: a chaos based artificial neural network architecture for classification (2019)
