TFad

swMATH ID: 7478
Software Authors: Gay, David M.
Description: Semiautomatic differentiation for efficient gradient computations. Many large-scale computations involve a mesh and first (or sometimes higher) partial derivatives of functions of mesh elements. In principle, automatic differentiation (AD) can provide the requisite partials more efficiently and accurately than conventional finite-difference approximations. AD requires source-code modifications, which may be little more than changes to declarations. Such simple changes can easily give improved results, e.g., when Jacobian-vector products are used iteratively to solve nonlinear equations. When gradients are required (say, for optimization) and the problem involves many variables, “backward AD” is in theory very efficient, but when carried out automatically and straightforwardly it may use a prohibitive amount of memory. In this case, applying AD separately to each element function and manually assembling the gradient pieces – semiautomatic differentiation – can deliver gradients efficiently and accurately. This paper concerns ongoing work; it compares several implementations of backward AD, describes a simple operator-overloading implementation specialized for gradient computations, and compares the implementations on some mesh-optimization examples. Ideas from the specialized implementation could be used in fully general source-to-source translators for C and C++.
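The semiautomatic scheme described above can be illustrated with a minimal C++ sketch. It is not TFad's actual interface: a toy operator-overloading dual-number type (Dual) stands in for an AD class, AD is applied to each element function separately, and the per-element derivative pieces are assembled by hand into the global gradient. All names here (Dual, element, elems) are hypothetical and only illustrate the idea.

#include <array>
#include <cstdio>
#include <vector>

// Toy forward-mode AD type: a value plus one derivative component.
struct Dual {
    double v;  // value
    double d;  // derivative w.r.t. the currently seeded variable
};
Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }

// Element function of two local variables, e.g. a per-edge energy.
// Templating lets the same source run on double or on the AD type,
// so only declarations change -- the point made in the description.
template <typename T>
T element(T x0, T x1) {
    return x0 * x0 + x0 * x1;  // toy quadratic element energy
}

int main() {
    // Global variables and a "mesh" of elements given by node-index pairs.
    std::vector<double> x = {1.0, 2.0, 3.0};
    std::vector<std::array<int, 2>> elems = {{0, 1}, {1, 2}};

    std::vector<double> grad(x.size(), 0.0);
    double f = 0.0;

    for (auto e : elems) {
        // Differentiate this element w.r.t. each of its two local
        // variables by seeding one derivative direction at a time.
        for (int local = 0; local < 2; ++local) {
            Dual a{x[e[0]], local == 0 ? 1.0 : 0.0};
            Dual b{x[e[1]], local == 1 ? 1.0 : 0.0};
            Dual r = element(a, b);
            grad[e[local]] += r.d;   // manual assembly of gradient pieces
            if (local == 0) f += r.v;
        }
    }

    std::printf("f = %g\n", f);
    for (size_t i = 0; i < grad.size(); ++i)
        std::printf("df/dx[%zu] = %g\n", i, grad[i]);
}

For this toy mesh the assembled gradient matches the analytic one (df/dx1 receives contributions from both elements touching node 1). Because AD is confined to one small element function at a time, memory stays bounded regardless of mesh size, in contrast to taping the whole computation as fully automatic backward AD would.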
Homepage: http://rd.springer.com/chapter/10.1007/3-540-28438-9_13
Keywords: semiautomatic differentiation; mesh elements; manual assembly; Jacobian-vector products; C/C++ source-to-source; TFad
Related Software: ADOL-C; ADIFOR; CppAD; TAF; L-BFGS; Trilinos; Sacado; FADBAD++; Matlab; TAPENADE; ADIC; DGM; TensorFlow; FPINNs; DeepXDE; DiffSharp; SU2; FLUENT; Peridigm; ABAQUS
Cited in: 12 Documents
