Autograd

Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, conditionals, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
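A minimal sketch of the workflow described above, using Autograd's documented grad function and its autograd.numpy wrapper (the printed values are approximate):

```python
import autograd.numpy as np   # thinly wrapped NumPy; operations are traced for differentiation
from autograd import grad     # grad(f) returns a function that computes df/dx

def tanh(x):
    # ordinary Python/NumPy code, no special annotations needed
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

d_tanh = grad(tanh)        # reverse-mode derivative of a scalar-valued function
dd_tanh = grad(d_tanh)     # derivatives compose: second derivative

print(d_tanh(1.0))         # roughly 0.4199743, i.e. 1 - tanh(1)**2
print(dd_tanh(1.0))        # second derivative at the same point
```

The returned derivative functions are plain Python callables, so they can be passed directly to a gradient-based optimizer.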


References in zbMATH (referenced in 22 articles)

Showing results 1 to 20 of 22, sorted by year (citations).

  1. Derryn Knife: SurPyval: Survival Analysis with Python (2021) not zbMATH
  2. Vlassis, Nikolaos N.; Sun, WaiChing: Sobolev training of thermodynamic-informed neural networks for interpretable elasto-plasticity models with level set hardening (2021)
  3. Julian Blank, Kalyanmoy Deb: pymoo: Multi-objective Optimization in Python (2020) arXiv
  4. Kamm, Jack; Terhorst, Jonathan; Durbin, Richard; Song, Yun S.: Efficiently inferring the demographic history of many populations with allele count data (2020)
  5. Katrutsa, Alexandr; Daulbaev, Talgat; Oseledets, Ivan: Black-box learning of multigrid parameters (2020)
  6. Laue, Sören; Mitterreiter, Matthias; Giesen, Joachim: A simple and efficient tensor calculus for machine learning (2020)
  7. Lee, Jaehoon; Xiao, Lechao; Schoenholz, Samuel S.; Bahri, Yasaman; Novak, Roman; Sohl-Dickstein, Jascha; Pennington, Jeffrey: Wide neural networks of any depth evolve as linear models under gradient descent (2020)
  8. R. Adhikari, Austen Bolitho, Fernando Caballero, Michael E. Cates, Jakub Dolezal, Timothy Ekeh, Jules Guioth, Robert L. Jack, Julian Kappler, Lukas Kikuchi, Hideki Kobayashi, Yuting I. Li, Joseph D. Peterson, Patrick Pietzonka, Benjamin Remez, Paul B. Rohrbach, Rajesh Singh, Günther Turk: Inference, prediction and optimization of non-pharmaceutical interventions using compartment models: the PyRoss library (2020) arXiv
  9. Blanchard, Antoine; Sapsis, Themistoklis P.: Learning the tangent space of dynamical instabilities from data (2019)
  10. Daniel Smilkov, Nikhil Thorat, Yannick Assogba, Ann Yuan, Nick Kreeger, Ping Yu, Kangyi Zhang, Shanqing Cai, Eric Nielsen, David Soergel, Stan Bileschi, Michael Terry, Charles Nicholson, Sandeep N. Gupta, Sarah Sirajuddin, D. Sculley, Rajat Monga, Greg Corrado, Fernanda B. Viegas, Martin Wattenberg: TensorFlow.js: Machine Learning for the Web and Beyond (2019) arXiv
  11. Ghosh, Soumya; Yao, Jiayu; Doshi-Velez, Finale: Model selection in Bayesian neural networks via horseshoe priors (2019)
  12. Masood, Muhammad A.; Doshi-Velez, Finale: A particle-based variational approach to Bayesian non-negative matrix factorization (2019)
  13. Oates, Chris J.; Cockayne, Jon; Briol, François-Xavier; Girolami, Mark: Convergence rates for a class of estimators based on Stein’s method (2019)
  14. Dan Moldovan, James M Decker, Fei Wang, Andrew A Johnson, Brian K Lee, Zachary Nado, D Sculley, Tiark Rompf, Alexander B Wiltschko: AutoGraph: Imperative-style Coding with Graph-based Performance (2018) arXiv
  15. Giordano, Ryan; Broderick, Tamara; Jordan, Michael I.: Covariances, robustness, and variational Bayes (2018)
  16. Giraldi, Loïc; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.: Optimal projection of observations in a Bayesian setting (2018)
  17. Shikhar Bhardwaj, Ryan R. Curtin, Marcus Edel, Yannis Mentekidis, Conrad Sanderson: ensmallen: a flexible C++ library for efficient function optimization (2018) arXiv
  18. Srajer, Filip; Kukelova, Zuzana; Fitzgibbon, Andrew: A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning (2018)
  19. Ville Bergholm, Josh Izaac, Maria Schuld, Christian Gogolin, M. Sohaib Alam, Shahnawaz Ahmed, Juan Miguel Arrazola, Carsten Blank, Alain Delgado, Soran Jahangiri, Keri McKiernan, Johannes Jakob Meyer, Zeyue Niu, Antal Száva, Nathan Killoran: PennyLane: Automatic differentiation of hybrid quantum-classical computations (2018) arXiv
  20. Bart van Merrienboer, Alexander B. Wiltschko, Dan Moldovan: Tangent: Automatic Differentiation Using Source Code Transformation in Python (2017) arXiv