Efficient NAS with FaDE on Hierarchical Spaces
Lecture Notes in Computer Science, Advances in Intelligent Data Analysis XXII (2024)
Abstract
Neural architecture search (NAS) is a challenging problem. Hierarchical
search spaces allow cheap evaluations of neural network sub-modules to
serve as surrogates for full architecture evaluations. Yet, sometimes the
hierarchy is too restrictive or the surrogate fails to generalize. We present
FaDE, which uses differentiable architecture search to obtain relative
performance predictions on finite regions of a hierarchical NAS space. The
relative nature of these ranks calls for a memory-less, batch-wise outer
search algorithm, for which we use an evolutionary algorithm with
pseudo-gradient descent. FaDE is especially suited to deep hierarchical
(i.e., multi-cell) search spaces, which it can explore at linear instead of
exponential cost, thereby eliminating the need for a proxy search space.
Our experiments show that, first, FaDE ranks on finite regions of the search
space correlate with the corresponding architecture performances and, second,
these ranks can empower a pseudo-gradient evolutionary search on the complete
neural architecture search space.
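The memory-less, batch-wise outer search mentioned above can be illustrated with a small sketch. This is not the authors' implementation; it is a hypothetical toy in which only the relative ranking of candidates within each freshly sampled batch drives the update (a pseudo-gradient step), and no archive of past evaluations is kept. The functions `evolve`, `rank_batch`, and the toy integer "architecture" space are all invented for illustration.

```python
import random

def rank_batch(batch, score):
    """Sort candidates best-first; only the relative order matters."""
    return sorted(batch, key=score, reverse=True)

def evolve(init, score, mutate, batch_size=8, steps=20, seed=0):
    """Memory-less, batch-wise evolutionary search with pseudo-gradient steps.

    Each iteration samples a fresh batch of mutations around the current
    parent and moves to the batch's top-ranked candidate if it out-ranks
    the parent. Because ranks only need to be consistent within one batch,
    no cross-batch memory (archive/population) is required.
    """
    rng = random.Random(seed)
    parent = init
    for _ in range(steps):
        batch = [mutate(parent, rng) for _ in range(batch_size)]
        best = rank_batch(batch, score)[0]
        if score(best) > score(parent):
            parent = best  # pseudo-gradient step toward the better candidate
    return parent

# Toy usage: "architectures" are integers, scored by a quadratic with peak 7.
result = evolve(
    init=0,
    score=lambda x: -(x - 7) ** 2,
    mutate=lambda x, rng: x + rng.choice([-2, -1, 1, 2]),
)
```

In FaDE itself the per-batch scores would come from the differentiable-search-derived relative ranks on a finite region of the hierarchical space, not from a closed-form objective as in this toy.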