Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, conditionals, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
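Since the entry itself gives no usage example, here is a minimal sketch of the reverse-mode API described above; `grad` is Autograd's documented entry point, and the `tanh` function is just an illustration:

```python
import autograd.numpy as np   # Autograd's thinly wrapped NumPy
from autograd import grad     # reverse-mode differentiation

def tanh(x):
    # tanh written with NumPy primitives that Autograd can differentiate
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)   # d/dx tanh(x), built by reverse mode
print(grad_tanh(1.0))    # ~0.419974, i.e. 1 - tanh(1)**2

# "Derivatives of derivatives": grad composes with itself arbitrarily.
print(grad(grad(tanh))(1.0))        # second derivative
print(grad(grad(grad(tanh)))(1.0))  # third derivative
```

Because `grad(tanh)` is itself an ordinary Python function, it can be passed back into `grad`, which is what makes arbitrary composition of differentiation possible.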
References in zbMATH (referenced in 6 articles)
- Smilkov, Daniel; Thorat, Nikhil; Assogba, Yannick; Yuan, Ann; Kreeger, Nick; Yu, Ping; Zhang, Kangyi; Cai, Shanqing; Nielsen, Eric; Soergel, David; Bileschi, Stan; Terry, Michael; Nicholson, Charles; Gupta, Sandeep N.; Sirajuddin, Sarah; Sculley, D.; Monga, Rajat; Corrado, Greg; Viegas, Fernanda B.; Wattenberg, Martin: TensorFlow.js: Machine Learning for the Web and Beyond (2019) arXiv
- Giordano, Ryan; Broderick, Tamara; Jordan, Michael I.: Covariances, robustness, and variational Bayes (2018)
- Giraldi, Loïc; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.: Optimal projection of observations in a Bayesian setting (2018)
- Srajer, Filip; Kukelova, Zuzana; Fitzgibbon, Andrew: A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning (2018)
- van Merrienboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan: Tangent: Automatic Differentiation Using Source Code Transformation in Python (2017) arXiv
- Kucukelbir, Alp; Tran, Dustin; Ranganath, Rajesh; Gelman, Andrew; Blei, David M.: Automatic differentiation variational inference (2017)