Keras

Keras: Deep Learning library for Theano and TensorFlow. Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation; being able to go from idea to result with the least possible delay is key to doing good research. Use Keras if you need a deep learning library that:
  - allows for easy and fast prototyping (through total modularity, minimalism, and extensibility);
  - supports both convolutional networks and recurrent networks, as well as combinations of the two;
  - supports arbitrary connectivity schemes (including multi-input and multi-output training);
  - runs seamlessly on CPU and GPU.
Read the documentation at Keras.io. Keras is compatible with Python 2.7-3.5.
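The prototyping style described above can be sketched with the Keras Sequential API. This is a minimal illustration, assuming Keras (with a TensorFlow or Theano backend) is installed; the layer sizes and activations are arbitrary choices, not prescribed by the library.

```python
# Minimal sketch of a Keras Sequential model.
# Layer sizes (16 -> 32 -> 1) are illustrative assumptions.
import numpy as np
import keras
from keras import layers

model = keras.Sequential([
    keras.Input(shape=(16,)),            # 16-dimensional input vectors
    layers.Dense(32, activation="relu"),  # fully connected hidden layer
    layers.Dense(1, activation="sigmoid"),  # binary-classification output
])
model.compile(optimizer="sgd", loss="binary_crossentropy")

# Run a forward pass on a random batch of 8 samples.
x = np.random.rand(8, 16).astype("float32")
y = model.predict(x)
print(y.shape)  # one probability per sample: (8, 1)
```

The same model could be built layer by layer with `model.add(...)`, or with the functional API when the arbitrary connectivity schemes mentioned above (multi-input, multi-output) are needed.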


References in zbMATH (referenced in 89 articles)

Sorted by year (citations)
  1. Hettwer, Benjamin; Gehrer, Stefan; Güneysu, Tim: Profiled power analysis attacks using convolutional neural networks with domain knowledge (2019)
  2. Higham, Catherine F.; Higham, Desmond J.: Deep learning: an introduction for applied mathematicians (2019)
  3. Hsieh-Fu Tsai, Joanna Gajda, Tyler F.W. Sloan, Andrei Rares, Jason Ting-Chun Chou, Amy Q. Shen: Usiigaci: Instance-aware cell tracking in stain-free phase contrast microscopy enabled by machine learning (2019) not zbMATH
  4. Huan, Er-Yang; Wen, Gui-Hua: Multilevel and multiscale feature aggregation in deep networks for facial constitution classification (2019)
  5. Jonas Fassbender: libconform v0.1.0: a Python library for conformal prediction (2019) arXiv
  6. Khoo, Yuehaw; Ying, Lexing: SwitchNet: a neural network model for forward and inverse scattering problems (2019)
  7. Laloy, Eric; Jacques, Diederik: Emulation of CPU-demanding reactive transport models: a comparison of Gaussian processes, polynomial chaos expansion, and deep neural networks (2019)
  8. Maeland, Steffen; Strümke, Inga: Deep learning with periodic features and applications in particle physics (2019)
  9. Mäkinen, Ymir; Kanniainen, Juho; Gabbouj, Moncef; Iosifidis, Alexandros: Forecasting jump arrivals in stock prices: new attention-based network architecture using limit order book data (2019)
  10. Prasse, Paul; Knaebel, René; Machlica, Lukáš; Pevný, Tomáš; Scheffer, Tobias: Joint detection of malicious domains and infected clients (2019)
  11. Ramasubramanian, Karthik; Singh, Abhishek: Machine learning using R. With time series and industry-based use cases in R (2019)
  12. Sosnovik, Ivan; Oseledets, Ivan: Neural networks for topology optimization (2019)
  13. Syrlybaeva, Raulia R.; Talipov, Marat R.: CBSF: a new empirical scoring function for docking parameterized by weights of neural network (2019)
  14. Szymański, Piotr; Kajdanowicz, Tomasz: scikit-multilearn: a scikit-based Python environment for performing multi-label classification (2019)
  15. Tahir, Muhammad; Tayara, Hilal; Chong, Kil To: iRNA-PseKNC(2methyl): identify RNA 2’-O-methylation sites by convolution neural network and Chou’s pseudo components (2019)
  16. Tripp, Bryan: Approximating the architecture of visual cortex in a convolutional network (2019)
  17. Tubiana, Jérôme; Cocco, Simona; Monasson, Rémi: Learning compositional representations of interacting systems with restricted Boltzmann machines: comparative study of lattice proteins (2019)
  18. Viktor Kazakov, Franz J. Király: Machine Learning Automation Toolbox (MLaut) (2019) arXiv
  19. Wang, Kun; Sun, WaiChing: Meta-modeling game for deriving theory-consistent, microstructure-based traction-separation laws via deep reinforcement learning (2019)
  20. Xiaomeng Dong, Junpyo Hong, Hsi-Ming Chang, Michael Potter, Aritra Chowdhury, Purujit Bahl, Vivek Soni, Yun-Chan Tsai, Rajesh Tamada, Gaurav Kumar, Caroline Favart, V. Ratna Saripalli, Gopal Avinash: FastEstimator: A Deep Learning Library for Fast Prototyping and Productization (2019) arXiv