An efficient global optimization algorithm for tuning hyperparameters of deep neural networks

Matthew West, University of Illinois at Urbana-Champaign

Usage Details

Matthew West, Chenchao Shou

Hyperparameters are crucial to the performance of a machine learning algorithm: the difference between poor and good hyperparameters can be the difference between a useless model and state-of-the-art performance. In recent years there has been growing interest in optimization algorithms that tune hyperparameters automatically, replacing traditional grid-based methods. To this end, we developed a new global optimization algorithm, the Adaptive Stochastic Response Surface (AdaSRS) method. Preliminary experiments on 32 optimization benchmark problems show that AdaSRS achieves the best optimization performance among seven state-of-the-art optimization algorithms, while being one to three orders of magnitude cheaper to run. These encouraging results motivate us to further develop the algorithm for tuning the hyperparameters of deep neural networks (DNNs), as we propose in this project. The proposed research requires iteratively training DNNs, a computational scale that only Blue Waters can handle. Moreover, because DNN training is highly parallelizable, the project is well suited to exploiting the CPU/GPU nodes of Blue Waters. Successful development of our algorithm could change the way people tune DNNs, and machine learning models in general, ultimately reducing tuning time while improving model quality.
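
To illustrate the general idea of response-surface (surrogate-based) hyperparameter tuning, the sketch below runs a minimal loop: fit a cheap surrogate to the configurations evaluated so far, score a pool of random candidates by predicted loss minus an exploration bonus, and evaluate the most promising one. This is not the AdaSRS algorithm itself; it is only a generic surrogate-search loop under assumed details, and the objective function, hyperparameter bounds, and weighting constants (validation_loss, bounds, the 0.5 exploration weight) are hypothetical stand-ins for training a real DNN.

# Illustrative sketch of a generic surrogate-based ("response surface")
# hyperparameter search loop -- NOT the authors' AdaSRS algorithm.
import numpy as np

rng = np.random.default_rng(0)

def validation_loss(x):
    """Hypothetical expensive objective: x = (log10 learning rate, dropout)."""
    lr_log, dropout = x
    return (lr_log + 3.0) ** 2 + 5.0 * (dropout - 0.2) ** 2 + 0.05 * rng.normal()

bounds = np.array([[-6.0, -1.0],   # log10 learning rate
                   [0.0, 0.8]])    # dropout rate

def rbf_surrogate(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant to the evaluated configurations."""
    K = np.exp(-eps * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
    w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)
    def predict(Z):
        Kz = np.exp(-eps * np.sum((Z[:, None, :] - X[None, :, :]) ** 2, axis=-1))
        return Kz @ w
    return predict

# Initial design: a few random configurations.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
y = np.array([validation_loss(x) for x in X])

for it in range(20):
    predict = rbf_surrogate(X, y)
    # Random candidate pool; score = surrogate prediction minus an
    # exploration bonus that favors points far from those already evaluated.
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 2))
    dist = np.min(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=-1), axis=1)
    score = predict(cand) - 0.5 * dist
    x_new = cand[np.argmin(score)]
    X = np.vstack([X, x_new])
    y = np.append(y, validation_loss(x_new))

best = X[np.argmin(y)]
print(f"best config: log10(lr)={best[0]:.2f}, dropout={best[1]:.2f}, loss={y.min():.3f}")

In practice, each call to the objective would train a DNN and report its validation loss, which is why the search must be frugal with evaluations and why the training runs themselves benefit from parallel CPU/GPU resources such as those on Blue Waters.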