Keras: Deep Learning library for Theano and TensorFlow.

Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation; being able to go from idea to result with the least possible delay is key to doing good research.

Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through total modularity, minimalism, and extensibility);
- supports both convolutional networks and recurrent networks, as well as combinations of the two;
- supports arbitrary connectivity schemes (including multi-input and multi-output training);
- runs seamlessly on CPU and GPU.

Read the documentation at keras.io. Keras is compatible with Python 2.7-3.5.
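To illustrate the fast-prototyping style described above, here is a minimal sketch of defining and compiling a small feed-forward classifier with the Keras `Sequential` API; the layer sizes (100 inputs, 32 hidden units, 1 output) are arbitrary choices for illustration.

```python
from keras.models import Sequential
from keras.layers import Dense

# Stack layers one after another: a hidden ReLU layer and a
# sigmoid output layer for binary classification.
model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))
model.add(Dense(1, activation='sigmoid'))

# Configure the learning process; training would then be a single
# call to model.fit(x, y) on NumPy arrays.
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])
```

The same model definition runs unchanged on CPU or GPU and on either backend; the backend is selected through the Keras configuration, not the model code.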

References in zbMATH (referenced in 89 articles)

Showing results 21 to 40 of 89.
Sorted by year (citations)
  21. P. E. Hadjidoukas, A. Bartezzaghi, F. Scheidegger, R. Istrate, C. Bekas, A. C. I. Malossi: torcpy: Supporting task parallelism in Python (2020) not zbMATH
  22. Ruehle, Fabian: Data science applications to string theory (2020)
  23. Shahriari, M.; Pardo, D.; Picon, A.; Galdran, A.; Del Ser, J.; Torres-Verdín, C.: A deep learning approach to the inversion of borehole resistivity measurements (2020)
  24. Szymon Maksymiuk, Alicja Gosiewska, Przemyslaw Biecek: Landscape of R packages for eXplainable Artificial Intelligence (2020) arXiv
  25. Vanessa Sochat: GridTest: testing and metrics collection for Python (2020) not zbMATH
  26. Vasilyeva, Maria; Leung, Wing T.; Chung, Eric T.; Efendiev, Yalchin; Wheeler, Mary: Learning macroscopic parameters in nonlinear multiscale simulations using nonlocal multicontinua upscaling techniques (2020)
  27. Wang, Qian; Ripamonti, Nicolò; Hesthaven, Jan S.: Recurrent neural network closure of parametric POD-Galerkin reduced-order models based on the Mori-Zwanzig formalism (2020)
  28. Willmott, Devin; Murrugarra, David; Ye, Qiang: Improving RNA secondary structure prediction via state inference with deep recurrent neural networks (2020)
  29. Ariafar, Setareh; Coll-Font, Jaume; Brooks, Dana; Dy, Jennifer: ADMMBO: Bayesian optimization with unknown constraints using ADMM (2019)
  30. Arridge, Simon; Maass, Peter; Öktem, Ozan; Schönlieb, Carola-Bibiane: Solving inverse problems using data-driven models (2019)
  31. Balakrishnan, Harikrishnan Nellippallil; Kathpalia, Aditi; Saha, Snehanshu; Nagaraj, Nithin: ChaosNet: a chaos based artificial neural network architecture for classification (2019)
  32. Barreto, Carlos; Koutsoukos, Xenofon: Design of load forecast systems resilient against cyber-attacks (2019)
  33. Bugbee, Bruce; Bush, Brian W.; Gruchalla, Kenny; Potter, Kristin; Brunhart-Lupo, Nicholas; Krishnan, Venkat: Enabling immersive engagement in energy system models with deep learning (2019)
  34. Cruz, Matheus A.; Thompson, Roney L.; Sampaio, Luiz E. B.; Bacchi, Raphael D. A.: The use of the Reynolds force vector in a physics informed machine learning approach for predictive turbulence modeling (2019)
  35. Daniel Smilkov, Nikhil Thorat, Yannick Assogba, Ann Yuan, Nick Kreeger, Ping Yu, Kangyi Zhang, Shanqing Cai, Eric Nielsen, David Soergel, Stan Bileschi, Michael Terry, Charles Nicholson, Sandeep N. Gupta, Sarah Sirajuddin, D. Sculley, Rajat Monga, Greg Corrado, Fernanda B. Viegas, Martin Wattenberg: TensorFlow.js: Machine Learning for the Web and Beyond (2019) arXiv
  36. Fan, Yuwei; Feliu-Fabà, Jordi; Lin, Lin; Ying, Lexing; Zepeda-Núñez, Leonardo: A multiscale neural network based on hierarchical nested bases (2019)
  37. Fan, Yuwei; Lin, Lin; Ying, Lexing; Zepeda-Núñez, Leonardo: A multiscale neural network based on hierarchical matrices (2019)
  38. Ghatak, Abhijit: Deep learning with R (2019)
  39. Gohr, Aron: Improving attacks on round-reduced Speck32/64 using deep learning (2019)
  40. Herzog, S.; Wörgötter, F.; Parlitz, U.: Convolutional autoencoder and conditional random fields hybrid for predicting spatial-temporal chaos (2019)