swMATH ID: 33648
Software Authors: Haifeng Jin, Qingquan Song, Xia Hu
Description: Auto-Keras: An Efficient Neural Architecture Search System. Neural architecture search (NAS) has been proposed to automatically tune deep neural networks, but existing search algorithms, e.g., NASNet and PNAS, usually suffer from expensive computational cost. Network morphism, which keeps the functionality of a neural network while changing its architecture, can make NAS more efficient by enabling faster training during the search. In this paper, we propose a novel framework that enables Bayesian optimization to guide the network morphism for efficient neural architecture search. The framework develops a neural network kernel and a tree-structured acquisition function optimization algorithm to explore the search space efficiently. Intensive experiments on real-world benchmark datasets demonstrate the superior performance of the developed framework over state-of-the-art methods. Moreover, we build an open-source AutoML system based on our method, namely Auto-Keras. The system runs in parallel on CPU and GPU, with an adaptive search strategy for different GPU memory limits.
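The search loop described above (Bayesian optimization proposing network-morphism edits, scored by an edit-distance-based kernel and an acquisition function) can be illustrated with a toy sketch. This is a hypothetical simplification, not the actual Auto-Keras implementation: architectures are reduced to lists of layer widths, the neural network kernel is replaced by a crude edit distance, and the Gaussian-process variance is stood in for by a distance-based exploration bonus in a UCB-style acquisition.

```python
import random

# Toy stand-in: an "architecture" is just a list of layer widths.

def edit_distance(a, b):
    # Crude architecture distance: depth gap plus width gaps
    # (stands in for the paper's neural network kernel).
    return abs(len(a) - len(b)) + sum(abs(x - y) for x, y in zip(a, b))

def morph(arch):
    # Network-morphism-style edit: deepen (insert a layer) or widen one.
    arch = list(arch)
    if random.random() < 0.5:
        arch.insert(random.randrange(len(arch) + 1), 32)  # deepen
    else:
        i = random.randrange(len(arch))
        arch[i] *= 2  # widen
    return arch

def acquisition(candidate, history, beta=1.0):
    # Surrogate: performance of the nearest evaluated architecture,
    # plus an exploration bonus growing with distance
    # (a stand-in for Gaussian-process posterior variance).
    dist, perf = min((edit_distance(candidate, a), p) for a, p in history)
    return perf + beta * dist

def search(evaluate, n_iters=10, seed=0):
    # BO-guided morphism search: morph the best architecture so far,
    # pick the candidate maximizing the acquisition, evaluate, repeat.
    random.seed(seed)
    base = [32, 32]
    history = [(base, evaluate(base))]
    for _ in range(n_iters):
        best_arch, _ = max(history, key=lambda ap: ap[1])
        candidates = [morph(best_arch) for _ in range(8)]
        nxt = max(candidates, key=lambda c: acquisition(c, history))
        history.append((nxt, evaluate(nxt)))
    return max(history, key=lambda ap: ap[1])
```

In the real system, `evaluate` would train the morphed network (cheaply, since morphism preserves the parent's function), and the tree-structured acquisition optimization would search over sequences of morphism operations rather than one-step edits.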
Homepage: https://autokeras.com
Source Code: https://github.com/keras-team/autokeras
Keywords: Machine Learning; arXiv_cs.LG; arXiv_Artificial Intelligence; cs.AI; arXiv_stat.ML; NAS; Automated Machine Learning; AutoML; Neural Architecture Search; Bayesian Optimization; Network Morphism
Related Software: Auto-WEKA; Python; auto-sklearn; TensorFlow; H2O; GitHub; PyTorch; DARTS; Hyperopt; Spearmint; Jupyter; JavaScript; sktime; Wave2Vec; mcfly; ProxylessNAS; SMAC; TPOT; Scikit; EGO
Cited in: 5 Documents

Standard Articles

1 Publication describing the Software:
Auto-Keras: An Efficient Neural Architecture Search System (arXiv)
Haifeng Jin, Qingquan Song, Xia Hu

Citations by Year