Keras

Keras: Deep Learning library for Theano and TensorFlow. Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation: being able to go from idea to result with the least possible delay is key to doing good research. Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through total modularity, minimalism, and extensibility);
- supports both convolutional networks and recurrent networks, as well as combinations of the two;
- supports arbitrary connectivity schemes (including multi-input and multi-output training);
- runs seamlessly on CPU and GPU.
Read the documentation at Keras.io. Keras is compatible with Python 2.7-3.5.
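As an illustration of the fast-prototyping style described above, a minimal Sequential model might look like the following. This is a sketch, assuming a working Keras installation with one of the supported backends; the layer sizes and the choice of optimizer and loss are arbitrary examples, not prescribed by the library.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Stack layers linearly; in recent Keras versions the input shape is
# inferred from the first batch of data the model sees.
model = Sequential()
model.add(Dense(64, activation='relu'))     # hidden layer (size chosen arbitrarily)
model.add(Dense(10, activation='softmax'))  # e.g. a 10-class output

# Configure training; 'sgd' and 'categorical_crossentropy' are standard
# Keras identifier strings.
model.compile(optimizer='sgd', loss='categorical_crossentropy')

# Forward pass on dummy data: 2 samples of 100 features each.
preds = model.predict(np.zeros((2, 100), dtype='float32'))
print(preds.shape)  # (2, 10)
```

The same model could equally be expressed with the functional API for the multi-input/multi-output connectivity schemes mentioned above; the Sequential form shown here covers the common linear-stack case.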


References in zbMATH (referenced in 89 articles)

Showing results 61 to 80 of 89.
Sorted by year (citations)
  1. Yoshida, Tomoki; Takeuchi, Ichiro; Karasuyama, Masayuki: Safe triplet screening for distance metric learning (2019)
  2. Zhou, Joey Tianyi; Pan, Sinno Jialin; Tsang, Ivor W.: A deep learning framework for hybrid heterogeneous transfer learning (2019)
  3. Adrian Bevan, Thomas Charman, Jonathan Hays: HIPSTER - A python package for particle physics analyses (2018) arXiv
  4. Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)
  5. Alex A. Alemi, Francois Chollet, Niklas Een, Geoffrey Irving, Christian Szegedy, Josef Urban: DeepMath - Deep Sequence Models for Premise Selection (2018) arXiv
  6. Fischer, Thomas; Krauss, Christopher: Deep learning with long short-term memory networks for financial market predictions (2018)
  7. Forster, Richárd; Fülöp, Agnes: Hierarchical clustering with deep q-learning (2018)
  8. Ghadai, Sambit; Balu, Aditya; Sarkar, Soumik; Krishnamurthy, Adarsh: Learning localized features in 3D CAD models for manufacturability analysis of drilled holes (2018)
  9. Lee, Seunghye; Ha, Jingwan; Zokhirova, Mehriniso; Moon, Hyeonjoon; Lee, Jaehong: Background information of deep learning for structural engineering (2018)
  10. Maximilian Christ, Nils Braun, Julius Neuffer, Andreas W. Kempa-Liehr: Time Series FeatuRe Extraction on basis of Scalable Hypothesis tests (tsfresh - A Python package) (2018) not zbMATH
  11. Michael Schaarschmidt, Sven Mika, Kai Fricke, Eiko Yoneki: RLgraph: Modular Computation Graphs for Deep Reinforcement Learning (2018) arXiv
  12. Pandey, Ram Krishna; Ramakrishnan, A. G.: Efficient document-image super-resolution using convolutional neural network (2018)
  13. Pan, Shaowu; Duraisamy, Karthik: Data-driven discovery of closure models (2018)
  14. Ryan Chard, Zhuozhao Li, Kyle Chard, Logan Ward, Yadu Babuji, Anna Woodard, Steve Tuecke, Ben Blaiszik, Michael J. Franklin, Ian Foster: DLHub: Model and Data Serving for Science (2018) arXiv
  15. Shikhar Bhardwaj, Ryan R. Curtin, Marcus Edel, Yannis Mentekidis, Conrad Sanderson: ensmallen: a flexible C++ library for efficient function optimization (2018) arXiv
  16. Skansi, Sandro: Introduction to deep learning. From logical calculus to artificial intelligence (2018)
  17. Vlachostergiou, Aggeliki; Caridakis, George; Mylonas, Phivos; Stafylopatis, Andreas: Learning representations of natural language texts with generative adversarial networks at document, sentence, and aspect level (2018)
  18. Wu, Hao; Prasad, Saurabh: Semi-supervised deep learning using pseudo labels for hyperspectral image classification (2018)
  19. Zhang, Junbo; Zheng, Yu; Qi, Dekang; Li, Ruiyuan; Yi, Xiuwen; Li, Tianrui: Predicting citywide crowd flows using deep spatio-temporal residual networks (2018)
  20. Zhiting Hu; Haoran Shi; Zichao Yang; Bowen Tan; Tiancheng Zhao; Junxian He; Wentao Wang; Xingjiang Yu; Lianhui Qin; Di Wang; Xuezhe Ma; Hector Liu; Xiaodan Liang; Wanrong Zhu; Devendra Singh Sachan; Eric P. Xing: Texar: A Modularized, Versatile, and Extensible Toolkit for Text Generation (2018) arXiv