Neural Architecture Search via Bayesian Optimization with a Neural Network Model

Colin White, Willie Neiswanger, Yash Savani

Semantic Scholar (2019)

Abstract
Neural Architecture Search (NAS) has seen an explosion of research in the past three years. A variety of methods have been proposed to perform NAS, including reinforcement learning, Bayesian optimization with a Gaussian process prior, evolutionary search, and gradient descent. In this work, we design a NAS algorithm which uses Bayesian optimization with a neural network prediction model. We use a path-based encoding scheme to featurize the neural architectures that are used as training data for the meta neural network. This method is particularly effective for encoding architectures in cell-based search spaces. After training on just 150 random neural architectures, we are able to predict the validation accuracy of an architecture to within one percent of its true accuracy on average. This may be of independent interest beyond Bayesian neural architecture search. We test our algorithm on the NAS-Bench-101 dataset [Ying et al. 2019], and we show that our algorithm significantly outperforms baselines including evolutionary search and reinforcement learning. We try several acquisition functions and show that the upper confidence bound function works the best. Finally, we show our path-based encoding scheme significantly improves the performance of the NAS algorithm compared to other encoding methods.
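The path-based encoding mentioned in the abstract can be illustrated with a small sketch. The idea, as described for cell-based search spaces like NAS-Bench-101, is to enumerate every path from the cell's input node to its output node and set one bit per possible operation sequence. The sketch below is a hypothetical reconstruction, not the authors' code: the operation set `OPS`, the adjacency-matrix cell representation, and the `max_len` cap on path length are all assumptions for illustration.

```python
from itertools import product

# Assumed operation set for a NAS-Bench-101-style cell (hypothetical).
OPS = ["conv3x3", "conv1x1", "maxpool3x3"]

def enumerate_paths(adj, ops):
    """Return every op sequence along a path from input (node 0) to
    output (last node) in a DAG given by adjacency matrix `adj`.
    `ops[i]` labels intermediate node i; input/output carry no op."""
    n = len(adj)
    paths = []
    def dfs(node, trail):
        if node == n - 1:               # reached the output node
            paths.append(tuple(trail))
            return
        for nxt in range(n):
            if adj[node][nxt]:
                label = [] if nxt == n - 1 else [ops[nxt]]
                dfs(nxt, trail + label)
    dfs(0, [])
    return paths

def path_encoding(adj, ops, max_len=3):
    """Binary feature vector with one slot per possible op sequence of
    length 0..max_len; a slot is 1 iff that path exists in the cell."""
    all_paths = [()]                    # the direct input->output path
    for k in range(1, max_len + 1):
        all_paths += list(product(OPS, repeat=k))
    index = {p: i for i, p in enumerate(all_paths)}
    vec = [0] * len(all_paths)
    for p in enumerate_paths(adj, ops):
        if p in index:
            vec[index[p]] = 1
    return vec

# Example cell: node 0 = input, nodes 1-2 = ops, node 3 = output,
# with edges 0->1->2->3 and a skip connection 0->3.
adj = [[0, 1, 0, 1],
       [0, 0, 1, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
ops = [None, "conv3x3", "maxpool3x3", None]
vec = path_encoding(adj, ops)
```

Vectors like `vec` would then serve as training features for the meta neural network that predicts validation accuracy; with three operations and `max_len=3` the vector has 1 + 3 + 9 + 27 = 40 entries, of which this example cell activates two.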