Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, conditionals, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which efficiently computes gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation; the two modes can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization. For more information, see the tutorial and the examples directory.

References in zbMATH (referenced in 16 articles)

Showing results 1 to 16 of 16, sorted by year (citations).

  1. Kamm, Jack; Terhorst, Jonathan; Durbin, Richard; Song, Yun S.: Efficiently inferring the demographic history of many populations with allele count data (2020)
  2. Katrutsa, Alexandr; Daulbaev, Talgat; Oseledets, Ivan: Black-box learning of multigrid parameters (2020)
  3. R. Adhikari, Austen Bolitho, Fernando Caballero, Michael E. Cates, Jakub Dolezal, Timothy Ekeh, Jules Guioth, Robert L. Jack, Julian Kappler, Lukas Kikuchi, Hideki Kobayashi, Yuting I. Li, Joseph D. Peterson, Patrick Pietzonka, Benjamin Remez, Paul B. Rohrbach, Rajesh Singh, Günther Turk: Inference, prediction and optimization of non-pharmaceutical interventions using compartment models: the PyRoss library (2020) arXiv
  4. Blanchard, Antoine; Sapsis, Themistoklis P.: Learning the tangent space of dynamical instabilities from data (2019)
  5. Daniel Smilkov, Nikhil Thorat, Yannick Assogba, Ann Yuan, Nick Kreeger, Ping Yu, Kangyi Zhang, Shanqing Cai, Eric Nielsen, David Soergel, Stan Bileschi, Michael Terry, Charles Nicholson, Sandeep N. Gupta, Sarah Sirajuddin, D. Sculley, Rajat Monga, Greg Corrado, Fernanda B. Viegas, Martin Wattenberg: TensorFlow.js: Machine Learning for the Web and Beyond (2019) arXiv
  6. Ghosh, Soumya; Yao, Jiayu; Doshi-Velez, Finale: Model selection in Bayesian neural networks via horseshoe priors (2019)
  7. Masood, Muhammad A.; Doshi-Velez, Finale: A particle-based variational approach to Bayesian non-negative matrix factorization (2019)
  8. Oates, Chris J.; Cockayne, Jon; Briol, François-Xavier; Girolami, Mark: Convergence rates for a class of estimators based on Stein’s method (2019)
  9. Dan Moldovan, James M Decker, Fei Wang, Andrew A Johnson, Brian K Lee, Zachary Nado, D Sculley, Tiark Rompf, Alexander B Wiltschko: AutoGraph: Imperative-style Coding with Graph-based Performance (2018) arXiv
  10. Giordano, Ryan; Broderick, Tamara; Jordan, Michael I.: Covariances, robustness, and variational Bayes (2018)
  11. Giraldi, Loïc; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.: Optimal projection of observations in a Bayesian setting (2018)
  12. Shikhar Bhardwaj, Ryan R. Curtin, Marcus Edel, Yannis Mentekidis, Conrad Sanderson: ensmallen: a flexible C++ library for efficient function optimization (2018) arXiv
  13. Srajer, Filip; Kukelova, Zuzana; Fitzgibbon, Andrew: A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning (2018)
  14. Bart van Merrienboer, Alexander B. Wiltschko, Dan Moldovan: Tangent: Automatic Differentiation Using Source Code Transformation in Python (2017) arXiv
  15. Kucukelbir, Alp; Tran, Dustin; Ranganath, Rajesh; Gelman, Andrew; Blei, David M.: Automatic differentiation variational inference (2017)
  16. Jarrett Revels, Miles Lubin, Theodore Papamarkou: Forward-Mode Automatic Differentiation in Julia (2016) arXiv