swMATH ID: 36242
Software Authors: Ahmed T. Elthakeb, Prannoy Pilligundla, Hadi Esmaeilzadeh
Description: SinReQ: Generalized Sinusoidal Regularization for Low-Bitwidth Deep Quantized Training. Deep quantization of neural networks (below eight bits) offers significant promise in reducing their compute and storage cost. Albeit alluring, without special techniques for training and optimization, deep quantization results in significant accuracy loss. To further mitigate this loss, we propose a novel sinusoidal regularization, called SinReQ, for deep quantized training. SinReQ adds a periodic term to the original objective function of the underlying training algorithm. SinReQ exploits the periodicity, differentiability, and the desired convexity profile of sinusoidal functions to automatically propel weights towards values that are inherently closer to quantization levels. Since this technique does not require invasive changes to the training procedure, SinReQ can harmoniously enhance quantized training algorithms. SinReQ offers generality and flexibility as it is not limited to a certain bitwidth or a uniform assignment of bitwidths across layers. We carry out experimentation using the AlexNet, CIFAR-10, ResNet-18, ResNet-20, SVHN, and VGG-11 DNNs with three to five bits for quantization and show the versatility of SinReQ in enhancing multiple quantized training algorithms, DoReFa [32] and WRPN [24]. Averaging across all the bit configurations shows that SinReQ closes the accuracy gap between these two techniques and the full-precision runs by 32.4%.
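The periodic term described above can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation: it assumes uniform quantization levels spaced delta = 1 / (2^b - 1) apart, so that sin^2(pi * w / delta) vanishes exactly at each level and is positive in between, nudging weights toward quantizable values during gradient descent. The function name and the lambda weighting are illustrative.

```python
import math

def sinreq_penalty(weights, bitwidth, lam=1.0):
    """Sketch of a sinusoidal regularization term in the spirit of SinReQ.

    The penalty is zero exactly at the uniform quantization levels
    k * delta (delta = 1 / (2^b - 1)) and positive everywhere else,
    so adding it to the training loss pulls weights toward values
    that quantize with little error. `lam` (a hypothetical knob here)
    trades off this pull against the original objective.
    """
    delta = 1.0 / (2 ** bitwidth - 1)  # step between quantization levels
    return lam * sum(math.sin(math.pi * w / delta) ** 2 for w in weights)
```

Because the term is built from sines, it is differentiable everywhere, so it composes with standard backpropagation without any change to the training procedure; per-layer bitwidths simply use a different delta per layer.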
Homepage: https://arxiv.org/abs/1905.01416
Keywords: Machine Learning; arXiv_cs.LG; arXiv_stat.ML
Related Software: DoReFa-Net; Smallify; gemmlowp; ONNX; Torchvision; FAKTA; AMC; BranchyNet; TensorFlow; MXNet; PocketFlow; OpenVino; TensorRT; PyTorch; Python; distiller
Cited in: 0 Documents