swMATH ID: 22077
Software Authors: Dougal Maclaurin; David Duvenaud; Matt Johnson
Description: Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python’s features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
Homepage: https://github.com/HIPS/autograd
Source Code: https://github.com/HIPS/autograd
Related Software: TensorFlow; PyTorch; Theano; Python; SciPy; NumPy; GitHub; JAX; DiffSharp; Scikit; Matlab; UCI-ml; Keras; Stan; Lantern; Julia; Numba; Tangent; Adam; Octave
Cited in: 17 Publications