TernGrad

swMATH ID: 22206
Software Authors: Wei Wen, Cong Xu, Feng Yan, Chunpeng Wu, Yandan Wang, Yiran Chen, Hai Li
Description: TernGrad: Ternary Gradients to Reduce Communication in Distributed Deep Learning. High network communication cost for synchronizing gradients and parameters is the well-known bottleneck of distributed training. In this work, we propose TernGrad, which uses ternary gradients to accelerate distributed deep learning in data parallelism. Our approach requires only three numerical levels {-1, 0, 1}, which can aggressively reduce the communication time. We mathematically prove the convergence of TernGrad under the assumption of a bound on gradients. Guided by the bound, we propose layer-wise ternarizing and gradient clipping to improve its convergence. Our experiments show that applying TernGrad on AlexNet does not incur any accuracy loss and can even improve accuracy. The accuracy loss of GoogLeNet induced by TernGrad is less than 2% on average.
Homepage: https://github.com/wenwei202/terngrad
Source Code: https://github.com/wenwei202/terngrad
Keywords: Learning (arXiv cs.LG); Distributed, Parallel, and Cluster Computing (arXiv cs.DC); Neural and Evolutionary Computing (arXiv cs.NE); arXiv; Deep Learning; Ternary Gradients
Related Software: HOGWILD; TensorFlow; ImageNet; CIFAR; Penn Treebank; GPipe; GNMT; Find; MNIST; Deep Speech; DoReFa-Net; Horovod; SCAFFOLD
Cited in: 3 Publications

Cited by 14 Authors:
1 Aksenov, Vitaly
1 Alistarh, Dan
1 Chen, Mengqiang
1 Faghri, Fartash
1 Guo, Binbin
1 Karimireddy, Sai Praneeth
1 Kuang, Di
1 Markov, Ilya
1 Mei, Yuan
1 Ramezani-Kebrya, Ali
1 Roy, Daniel M.
1 Stich, Sebastian U.
1 Wu, Weigang
1 Xiao, Danyang

Cited in 2 Serials:
2 Journal of Machine Learning Research (JMLR)
1 Information Sciences

Cited in 1 Field:
3 Computer science (68-XX)
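The description above outlines the core mechanism: layer-wise stochastic ternarization combined with gradient clipping. Below is a minimal NumPy sketch of that step for a single layer's gradient, assuming the unbiased quantization rule the description summarizes; the function name ternarize, its signature, and the clip_factor default are illustrative assumptions, not the API of the linked repository (which implements TernGrad in TensorFlow).

```python
import numpy as np

def ternarize(grad, clip_factor=2.5, rng=None):
    """Sketch of TernGrad-style layer-wise ternarizing of one gradient tensor.

    After clipping, each component g_k is mapped to s * sign(g_k) with
    probability |g_k| / s, where s = max(|g|) over the layer, and to 0
    otherwise, so the quantized gradient is unbiased in expectation.
    """
    rng = rng or np.random.default_rng()
    # Gradient clipping (clip_factor is an assumed hyperparameter: a few
    # standard deviations) tightens the scaler s and aids convergence.
    sigma = grad.std()
    g = np.clip(grad, -clip_factor * sigma, clip_factor * sigma)
    s = np.abs(g).max()  # layer-wise scaler shared by all components
    if s == 0.0:
        return np.zeros_like(g)
    keep = rng.random(g.shape) < np.abs(g) / s  # Bernoulli mask
    return s * np.sign(g) * keep  # values in {-s, 0, +s}
```

In a data-parallel setting, each worker would then transmit only the scalar s and the ternary sign pattern per layer instead of full-precision gradients, which is the source of the communication savings claimed in the description.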