Local Search is State of the Art for Neural Architecture Search Benchmarks

arXiv (Cornell University), 2020

Abstract
Local search is one of the simplest families of algorithms in combinatorial optimization, yet it yields strong approximation guarantees for canonical NP-complete problems such as the traveling salesman problem and vertex cover. While it is a ubiquitous algorithm in theoretical computer science, local search is often neglected in hyperparameter optimization and neural architecture search (NAS). We show that the simplest local search instantiations achieve state-of-the-art results on multiple NAS benchmarks (NASBench-101 and NASBench-201), outperforming the most popular recent NAS algorithms. However, local search fails to perform well on the much larger DARTS search space. Motivated by these observations, we present a theoretical study which characterizes the performance of local search on graph optimization problems, backed by simulation results. This may be of independent interest beyond NAS. All code and materials needed to reproduce our results are publicly available at this https URL.
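The "simplest local search instantiation" referred to in the abstract is greedy hill climbing over a discrete neighborhood: repeatedly evaluate the neighbors of the current architecture and move to the best one until no neighbor improves. The sketch below illustrates this general idea only; `get_neighbors`, `evaluate`, and the bit-string encoding are hypothetical placeholders for a NAS benchmark's neighborhood and accuracy-query interfaces, not the authors' released code.

```python
# Minimal hill-climbing local search over a generic discrete search space.
# `get_neighbors` and `evaluate` are assumed callables standing in for a
# NAS benchmark's neighborhood and accuracy-query functions.

import random


def local_search(initial_arch, get_neighbors, evaluate, max_evals=100):
    """Greedy hill climbing: move to the best neighbor until none improves."""
    current = initial_arch
    current_score = evaluate(current)
    evals = 1

    while evals < max_evals:
        best_neighbor, best_score = None, current_score
        for arch in get_neighbors(current):
            if evals >= max_evals:
                break
            score = evaluate(arch)
            evals += 1
            if score > best_score:
                best_neighbor, best_score = arch, score
        if best_neighbor is None:  # local optimum reached
            break
        current, current_score = best_neighbor, best_score

    return current, current_score


# Toy usage: architectures encoded as bit strings, neighbors differ by one bit.
if __name__ == "__main__":
    random.seed(0)
    target = [random.randint(0, 1) for _ in range(10)]

    def evaluate(arch):
        # Stand-in objective: agreement with a hidden target encoding.
        return sum(a == t for a, t in zip(arch, target))

    def get_neighbors(arch):
        return [arch[:i] + [1 - arch[i]] + arch[i + 1:] for i in range(len(arch))]

    start = [random.randint(0, 1) for _ in range(10)]
    best, score = local_search(start, get_neighbors, evaluate)
    print(best, score)
```

In a real NAS setting the neighborhood would instead perturb one operation or edge of a cell encoding, and `evaluate` would query the benchmark's tabulated validation accuracy.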
Keywords
neural architecture search benchmarks, local search