Graph is all you need? Lightweight data-agnostic neural architecture search without training
arXiv (2024)
Abstract
Neural architecture search (NAS) enables the automatic design of neural
network models. However, training the candidates generated by the search
algorithm for performance evaluation incurs considerable computational
overhead. Our method, dubbed nasgraph, remarkably reduces the computational
costs by converting neural architectures to graphs and using the average
degree, a graph measure, as the proxy in lieu of the evaluation metric. Our
training-free NAS method is data-agnostic and lightweight. It can find the
best architecture among 200 randomly sampled architectures from NAS-Bench-201
in 217 CPU seconds. Besides, our method achieves competitive performance
across various search spaces, including NAS-Bench-101, NAS-Bench-201, and NDS.
We also demonstrate that nasgraph generalizes to more challenging tasks on
Micro TransNAS-Bench-101.
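The core idea of using average degree as a training-free proxy can be illustrated with a minimal sketch (not the authors' implementation): given a candidate architecture already converted to a graph, its score is simply the average degree of that graph, and candidates are ranked by this score instead of trained accuracy. The edge lists below are hypothetical example graphs, not architectures from any benchmark.

```python
# Minimal sketch of an average-degree proxy for training-free NAS.
# Assumption: each candidate architecture has already been converted
# to an undirected graph, represented here as (num_nodes, edge_list).

def average_degree(num_nodes, edges):
    """Average degree of an undirected graph: 2 * |E| / |V|."""
    if num_nodes == 0:
        return 0.0
    return 2 * len(edges) / num_nodes

# Hypothetical candidate graphs (in the paper these would come from
# converting sampled architectures, e.g. from NAS-Bench-201).
candidates = {
    "arch_a": (4, [(0, 1), (1, 2), (2, 3)]),          # sparse chain
    "arch_b": (4, [(0, 1), (0, 2), (1, 2), (2, 3)]),  # denser graph
}

# Rank candidates by the proxy in lieu of training each one.
scores = {name: average_degree(n, e) for name, (n, e) in candidates.items()}
best = max(scores, key=scores.get)
print(scores)  # {'arch_a': 1.5, 'arch_b': 2.0}
print(best)    # arch_b
```

Because the proxy needs only the graph structure, no training data ever enters the scoring loop, which is what makes the method data-agnostic and cheap enough to run on a CPU.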