Keras

Keras: Deep Learning library for Theano and TensorFlow. Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation: being able to go from idea to result with the least possible delay is key to doing good research. Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through total modularity, minimalism, and extensibility);
- supports both convolutional networks and recurrent networks, as well as combinations of the two;
- supports arbitrary connectivity schemes (including multi-input and multi-output training);
- runs seamlessly on CPU and GPU.
Read the documentation at Keras.io. Keras is compatible with Python 2.7-3.5.
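As an illustration of the prototyping style described above, the sketch below builds and trains a small fully connected classifier with the Keras Sequential API. The layer sizes, optimizer choice, and toy data are assumptions made for the example, not part of the original entry.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Toy data (illustrative only): 1000 samples with 100 features,
# labels one-hot encoded over 10 classes.
x_train = np.random.random((1000, 100))
y_train = np.eye(10)[np.random.randint(10, size=1000)]

# A small fully connected classifier; the layer sizes are arbitrary.
model = Sequential([
    Dense(64, activation='relu', input_dim=100),
    Dense(10, activation='softmax'),
])

# Configure the learning process, then train for a few passes over the data.
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, batch_size=32)  # older Keras releases use nb_epoch
```

The same model could be expressed with the functional API when multi-input or multi-output connectivity is needed; the Sequential form shown here is just the shortest path to a running model.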


References in zbMATH (referenced in 85 articles)

Showing results 1 to 20 of 85, sorted by year (citations).


  1. Ali Shahin Shamsabadi, Adria Gascon, Hamed Haddadi, Andrea Cavallaro: PrivEdge: From Local to Distributed Private Training and Prediction (2020) arXiv
  2. Breger, A.; Orlando, J. I.; Harar, P.; Dörfler, M.; Klimscha, S.; Grechenig, C.; Gerendas, B. S.; Schmidt-Erfurth, U.; Ehler, M.: On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems (2020)
  3. Cheung, Siu Wun; Chung, Eric T.; Efendiev, Yalchin; Gildin, Eduardo; Wang, Yating; Zhang, Jingyan: Deep global model reduction learning in porous media flow simulation (2020)
  4. Gakhar, Saksham; Koseff, Jeffrey R.; Ouellette, Nicholas T.: On the surface expression of bottom features in free-surface flow (2020)
  5. Heider, Yousef; Wang, Kun; Sun, WaiChing: (\mathrm{SO}(3))-invariance of informed-graph-based deep neural network for anisotropic elastoplastic materials (2020)
  6. Hottung, André; Tanaka, Shunji; Tierney, Kevin: Deep learning assisted heuristic tree search for the container pre-marshalling problem (2020)
  7. Hughes, Mark C.: A neural network approach to predicting and computing knot invariants (2020)
  8. Kharrat, Tarak; McHale, Ian G.; Peña, Javier López: Plus-minus player ratings for soccer (2020)
  9. Lejeune, Emma; Linder, Christian: Interpreting stochastic agent-based models of cell death (2020)
  10. Liberti, Leo: Distance geometry and data science (2020)
  11. Liu, Peng; Song, Yan: Segmentation of sonar imagery using convolutional neural networks and Markov random field (2020)
  12. Lukas Geiger; Plumerai Team: Larq: An Open-Source Library for Training Binarized Neural Networks (2020) not zbMATH
  13. Lye, Kjetil O.; Mishra, Siddhartha; Ray, Deep: Deep learning observables in computational fluid dynamics (2020)
  14. Meister, Felix; Passerini, Tiziano; Mihalef, Viorel; Tuysuzoglu, Ahmet; Maier, Andreas; Mansi, Tommaso: Deep learning acceleration of total Lagrangian explicit dynamics for soft tissue mechanics (2020)
  15. Osman, Yousuf Babiker M.; Li, Wei: Soft sensor modeling of key effluent parameters in wastewater treatment process based on SAE-NN (2020)
  16. Palagi, Laura; Seccia, Ruggiero: Block layer decomposition schemes for training deep neural networks (2020)
  17. Parish, Eric J.; Carlberg, Kevin T.: Time-series machine-learning error models for approximate solutions to parameterized dynamical systems (2020)
  18. P. E. Hadjidoukas, A. Bartezzaghi, F. Scheidegger, R. Istrate, C. Bekas, A. C. I. Malossi: torcpy: Supporting task parallelism in Python (2020) not zbMATH
  19. Ruehle, Fabian: Data science applications to string theory (2020)
  20. Shahriari, M.; Pardo, D.; Picon, A.; Galdran, A.; Del Ser, J.; Torres-Verdín, C.: A deep learning approach to the inversion of borehole resistivity measurements (2020)
