
SGDR

swMATH ID: 30752
Software Authors: Ilya Loshchilov, Frank Hutter
Description: SGDR: Stochastic Gradient Descent with Warm Restarts. Restart techniques are common in gradient-free optimization to deal with multimodal functions. Partial warm restarts are also gaining popularity in gradient-based optimization to improve the rate of convergence of accelerated gradient schemes on ill-conditioned functions. In this paper, we propose a simple warm restart technique for stochastic gradient descent to improve its anytime performance when training deep neural networks. We empirically study its performance on the CIFAR-10 and CIFAR-100 datasets, where we demonstrate new state-of-the-art results at 3.14% and 16.21% test error, respectively. (A sketch of the restart schedule appears below.)
Homepage: https://arxiv.org/abs/1608.03983
Source Code: https://github.com/loshchil/SGDR
Related Software: Adam; PyTorch; ImageNet; CIFAR; Python; MixMatch; mixup; RMSprop; RandAugment; ReMixMatch; FixMatch; MobileNetV2; DARTS; EfficientNet; TensorFlow; DeepONet; EnKF; SeqGAN; ADADELTA; GitHub
Cited in: 16 Documents
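
The schedule behind SGDR anneals the learning rate within each run via a cosine curve, eta_t = eta_min + 0.5 * (eta_max - eta_min) * (1 + cos(pi * T_cur / T_i)), and then "warm restarts" it back to eta_max, with each cycle length T_i growing by a factor T_mult. The following is a minimal plain-Python sketch of that rule, not the authors' code; the function name sgdr_lr and the parameter values shown are illustrative.

    import math

    def sgdr_lr(epoch, eta_min=0.0, eta_max=0.05, t_0=10, t_mult=2):
        """Cosine-annealed learning rate with warm restarts.

        t_0 is the length of the first cycle in epochs; after each
        restart the cycle length is multiplied by t_mult. (Illustrative
        sketch of the SGDR schedule; parameter defaults are assumptions.)
        """
        t_i, t_cur = t_0, epoch
        # Walk forward through completed cycles to locate the current one.
        while t_cur >= t_i:
            t_cur -= t_i
            t_i *= t_mult
        # Cosine annealing from eta_max down to eta_min within the cycle.
        return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))

    # Example: print the schedule across the first two cycles (epochs 0-29);
    # the rate resets to eta_max at epoch 10, then decays over a 20-epoch cycle.
    for epoch in range(30):
        print(epoch, round(sgdr_lr(epoch), 4))

PyTorch (listed under Related Software) ships an implementation of this schedule as torch.optim.lr_scheduler.CosineAnnealingWarmRestarts.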
