GELUs

swMATH ID: 36443
Software Authors: Dan Hendrycks, Kevin Gimpel
Description: Gaussian Error Linear Units (GELUs). We propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function. The GELU activation function is xΦ(x), where Φ(x) is the standard Gaussian cumulative distribution function. The GELU nonlinearity weights inputs by their value, rather than gating inputs by their sign as in ReLUs (x·1_{x>0}). We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all considered computer vision, natural language processing, and speech tasks.
Homepage: https://arxiv.org/abs/1606.08415
Source Code: https://github.com/hendrycks/GELUs
Keywords: Machine Learning; arXiv_cs.LG; Gaussian Error Linear Unit; GELU; neural network; activation function
Related Software: Adam; PyTorch; ABAQUS; COMSOL; SGDR; Zoneout; DP-GEN; PySCF; DeePKS-kit; XNOR-Net; AlexNet; Grad-CAM; Faster R-CNN; Fashion-MNIST; t-SNE; ImageNet; MNIST; Scikit; UCI-ml; DLMF
Cited in: 4 Publications

Standard Articles (1 publication describing the software):
Gaussian Error Linear Units (GELUs), Dan Hendrycks, Kevin Gimpel (2016)

Cited by 11 Authors:
1 Bastek, Jan-Hendrik
1 Doshi-Velez, Finale
1 E, Weinan
1 Hughes, Michael C.
1 Hyndman, Cody Blaine
1 Kochmann, Dennis M.
1 Kratsios, Anastasis
1 Ma, Chao
1 Parbhoo, Sonali
1 Roth, Volker
1 Wu, Mike

Cited in 4 Serials:
1 The Journal of Artificial Intelligence Research (JAIR)
1 European Journal of Mechanics. A. Solids
1 Journal of Machine Learning Research (JMLR)
1 Science China. Mathematics

Cited in 3 Fields:
3 Computer science (68-XX)
1 Calculus of variations and optimal control; optimization (49-XX)
1 Mechanics of deformable solids (74-XX)
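The description above defines the GELU by the closed-form expression xΦ(x). As a concrete illustration, here is a minimal Python sketch of the exact GELU, using Φ(x) = 0.5·(1 + erf(x/√2)), together with the tanh approximation stated in the paper; the function names gelu and gelu_tanh are illustrative choices, not identifiers from the authors' repository.

import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard Gaussian CDF,
    # computed here as Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # Tanh approximation given in the paper:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3))).
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

if __name__ == "__main__":
    # Unlike ReLU's hard gate x*1_{x>0}, GELU weights x by P(X <= x) with X ~ N(0, 1),
    # so small negative inputs are shrunk smoothly rather than zeroed.
    for v in (-2.0, -0.5, 0.0, 0.5, 2.0):
        print(f"x={v:+.1f}  exact={gelu(v):+.6f}  tanh approx={gelu_tanh(v):+.6f}")

In practice, frameworks listed under Related Software ship their own implementations (for example, torch.nn.GELU in PyTorch), which are preferable to a hand-rolled version.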