
Hyperband

swMATH ID: 41120
Software Authors: L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh, A. Talwalkar
Description: Hyperband: a novel bandit-based approach to hyperparameter optimization. Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While recent approaches use Bayesian optimization to adaptively select configurations, we focus on speeding up random search through adaptive resource allocation and early-stopping. We formulate hyperparameter optimization as a pure-exploration nonstochastic infinite-armed bandit problem where a predefined resource like iterations, data samples, or features is allocated to randomly sampled configurations. We introduce a novel algorithm, Hyperband, for this framework and analyze its theoretical properties, providing several desirable guarantees. Furthermore, we compare Hyperband with popular Bayesian optimization methods on a suite of hyperparameter optimization problems. We observe that Hyperband can provide over an order-of-magnitude speedup over our competitor set on a variety of deep-learning and kernel-based learning problems.
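A minimal sketch of the bracket structure described above, in plain Python. The callables get_config and run_then_return_loss are user-supplied placeholders (their names are illustrative, not from the paper); max_resource and eta correspond to the paper's R and η, with the defaults R = 81, η = 3 matching values used in the paper's examples.

import math
import random  # only needed by the toy demo at the bottom

def hyperband(get_config, run_then_return_loss, max_resource=81, eta=3):
    """Sketch of the Hyperband bracket structure (Li et al., 2016).

    get_config()                 -> draws one random hyperparameter configuration
    run_then_return_loss(cfg, r) -> trains cfg with budget r and returns a loss
    max_resource                 -> maximum budget R a single configuration may receive
    eta                          -> downsampling rate of successive halving
    """
    s_max = int(math.log(max_resource, eta))   # index of the most aggressive bracket
    budget = (s_max + 1) * max_resource        # total budget B allotted to each bracket
    best_cfg, best_loss = None, float("inf")

    for s in reversed(range(s_max + 1)):       # brackets trade off n (configs) vs. r (resource)
        n = int(math.ceil(budget / max_resource * eta**s / (s + 1)))
        r = max_resource * eta**(-s)           # minimum resource given to a config in this bracket
        configs = [get_config() for _ in range(n)]

        for i in range(s + 1):                 # successive halving inside the bracket
            n_i = int(n * eta**(-i))
            r_i = r * eta**i                   # resource per surviving config (may be fractional)
            losses = [run_then_return_loss(c, r_i) for c in configs]
            ranked = sorted(zip(losses, range(len(configs))))
            if ranked[0][0] < best_loss:
                best_loss, best_cfg = ranked[0][0], configs[ranked[0][1]]
            keep = max(int(n_i / eta), 1)      # keep the top 1/eta fraction
            configs = [configs[idx] for _, idx in ranked[:keep]]

    return best_cfg, best_loss

if __name__ == "__main__":
    # Toy demo (illustrative only): tune a log-uniform "lr" where a larger
    # budget yields a less noisy loss estimate.
    def get_config():
        return {"lr": 10 ** random.uniform(-4, 0)}

    def run_then_return_loss(cfg, resource):
        return (math.log10(cfg["lr"]) + 2) ** 2 + random.gauss(0, 1.0 / resource)

    print(hyperband(get_config, run_then_return_loss))

Each bracket s trades the number of sampled configurations n against the minimum resource r given to each; the outer loop sweeps from the most aggressive early-stopping bracket (s = s_max) down to s = 0, which is plain random search with the full resource R per configuration.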
Homepage: https://arxiv.org/abs/1603.06560
Source Code: https://github.com/thuijskens/scikit-hyperband
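The linked scikit-hyperband repository wraps the algorithm in a scikit-learn-style search estimator. Below is a minimal usage sketch; the HyperbandSearchCV class, its import path, and the resource_param argument are taken from that project's README and should be treated as assumptions rather than a verified API.

from scipy.stats import randint
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier

from hyperband import HyperbandSearchCV  # import path assumed from the repo README

X, y = load_digits(return_X_y=True)

model = RandomForestClassifier(random_state=0)
param_dist = {
    "max_depth": [3, None],
    "min_samples_split": randint(2, 11),
    "criterion": ["gini", "entropy"],
}

# resource_param names the estimator parameter used as the budget R
# (here the number of trees grown by the forest).
search = HyperbandSearchCV(model, param_dist, resource_param="n_estimators")
search.fit(X, y)
print(search.best_params_, search.best_score_)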
Related Software: Spearmint; SMAC; Hyperopt; BOHB; Adam; GitHub; Scikit; PyTorch; BoTorch; AlexNet; ImageNet; auto-sklearn; TensorFlow; NOMAD; R; CIFAR; MNIST; RMSprop; Python; RoBO
Cited in: 21 Documents
