Forward-Mode Automatic Differentiation in Julia. We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support for higher-order differentiation and differentiation using custom number types (including complex numbers). For gradient and Jacobian calculations, ForwardDiff provides a variant of vector-forward mode that avoids expensive heap allocation and makes better use of memory bandwidth than traditional vector mode. In our numerical experiments, we demonstrate that for nontrivially large dimensions, ForwardDiff's gradient computations can be faster than a reverse-mode implementation from the Python-based autograd package. We also illustrate how ForwardDiff is used effectively within JuMP, a modeling language for optimization. According to our usage statistics, 41 unique repositories on GitHub depend on ForwardDiff, with users from diverse fields such as astronomy, optimization, finite element analysis, and statistics. This document is an extended abstract that has been accepted for presentation at AD2016, the 7th International Conference on Algorithmic Differentiation.
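To make the abstract's key ideas concrete, here is a minimal Python sketch of the dual-number mechanics behind forward-mode AD, including the "vector-forward" idea of propagating several perturbation components in a single pass so that one evaluation of the function yields the full gradient. This is an illustrative toy, not ForwardDiff's actual (Julia) implementation: the class name `Dual` and the helper `gradient` are our own, and ForwardDiff additionally uses stack-allocated partials and JIT specialization that plain Python cannot express.

```python
class Dual:
    """Toy dual number: a value plus a tuple of partial derivatives.
    Carrying n partials at once mimics vector-forward mode: one sweep
    through f computes all n components of the gradient."""
    def __init__(self, val, partials):
        self.val = val
        self.partials = tuple(partials)

    def __add__(self, other):
        other = _lift(other, len(self.partials))
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val,
                    (a + b for a, b in zip(self.partials, other.partials)))
    __radd__ = __add__

    def __mul__(self, other):
        other = _lift(other, len(self.partials))
        # Product rule: (u * v)' = u' v + u v'
        return Dual(self.val * other.val,
                    (a * other.val + self.val * b
                     for a, b in zip(self.partials, other.partials)))
    __rmul__ = __mul__

def _lift(x, n):
    """Treat plain numbers as constants (all partials zero)."""
    return x if isinstance(x, Dual) else Dual(x, (0.0,) * n)

def gradient(f, x):
    """Seed the i-th input with the i-th unit perturbation, then read
    the full gradient off a single forward evaluation of f."""
    n = len(x)
    seeds = [Dual(x[i], tuple(1.0 if j == i else 0.0 for j in range(n)))
             for i in range(n)]
    return f(seeds).partials

# f(x, y) = x*y + x, so the gradient is (y + 1, x); at (3, 2) this is (3, 3).
g = gradient(lambda v: v[0] * v[1] + v[0], [3.0, 2.0])  # → (3.0, 3.0)
```

In ForwardDiff itself the analogous call is `ForwardDiff.gradient(f, x)`; because Julia's JIT compiler specializes `f` on the dual-number type, the recompiled code runs at speeds the abstract describes as competitive with C++, which this interpreted sketch does not attempt to match.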
References in zbMATH (referenced in 10 articles, 1 standard article)
- Cancès, Clément; Chainais-Hillairet, Claire; Fuhrmann, Jürgen; Gaudeul, Benoît: On four numerical schemes for a unipolar degenerate drift-diffusion model (2020)
- Milz, Johannes; Ulbrich, Michael: An approximation scheme for distributionally robust nonlinear optimization (2020)
- Orban, Dominique; Siqueira, Abel Soares: A regularization method for constrained nonlinear least squares (2020)
- Emerson V. Castelani; Ronaldo Lopes; Wesley V. I. Shirabayashi; Francisco N. C. Sobral: RAFF.jl: Robust Algebraic Fitting Function in Julia (2019) not zbMATH
- Tim Besard, Valentin Churavy, Alan Edelman, Bjorn De Sutter: Rapid software prototyping for heterogeneous and distributed platforms (2019) not zbMATH
- Wormell, Caroline: Spectral Galerkin methods for transfer operators in uniformly expanding dynamics (2019)
- Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
- Schmitt, Jeremy; Shingel, Tatiana; Leok, Melvin: Lagrangian and Hamiltonian Taylor variational integrators (2018)
- Dunning, Iain; Huchette, Joey; Lubin, Miles: JuMP: a modeling language for mathematical optimization (2017)
- Revels, Jarrett; Lubin, Miles; Papamarkou, Theodore: Forward-Mode Automatic Differentiation in Julia (2016) arXiv