Operation And Topology Aware Fast Differentiable Architecture Search

2020 25th International Conference on Pattern Recognition (ICPR)

Abstract
Differentiable architecture search (DARTS) has gained significant attention among neural architecture search approaches due to its effectiveness in finding competitive network architectures at affordable computational cost. However, DARTS' search space is designed such that even a randomly sampled architecture performs reasonably well. Moreover, due to the complexity of the searched architectural building block, or cell, it is unclear whether it is certain operations or the cell topology that contributes most to achieving higher final accuracy. In this work, we dissect the DARTS search space to understand which components are most effective in producing better architectures. Our experiments show that: (1) good architectures can be discovered regardless of the search network depth; (2) separable convolution with a 3x3 kernel is the most effective operation in this search space; and (3) the cell topology also has a substantial effect on accuracy. Based on these insights, we propose an efficient search approach, referred to as eDARTS, which searches over a pre-specified cell with a good topology, pays increased attention to important operations, and uses a shallow search supernet. Moreover, we propose optimizations for eDARTS that significantly speed up the search and alleviate the well-known skip-connection aggregation problem of DARTS. eDARTS achieves an error rate of 2.53% on CIFAR-10 with a 3.1M-parameter model, while the search costs less than 30 minutes.
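For context on the mechanism the abstract builds on: DARTS relaxes the discrete choice of operation on each cell edge into a softmax-weighted mixture over candidate operations, so the architecture parameters can be trained by gradient descent. The sketch below is my own minimal illustration of that mixed operation, not the authors' code; the operation names and the toy stand-in functions are hypothetical.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical stand-ins for the candidate operations on an edge; the
# real DARTS space uses e.g. separable 3x3 convolutions, pooling, and
# skip connections on feature maps.
CANDIDATE_OPS = {
    "skip_connect": lambda x: x,
    "sep_conv_3x3": lambda x: 0.5 * x,              # toy proxy for a learned conv
    "avg_pool_3x3": lambda x: np.full_like(x, x.mean()),
}

def mixed_op(x, alpha):
    """DARTS continuous relaxation: sum_i softmax(alpha)_i * op_i(x)."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, CANDIDATE_OPS.values()))

x = np.array([1.0, 2.0, 3.0])
alpha = np.array([0.1, 2.0, -1.0])  # large weight favors sep_conv_3x3
y = mixed_op(x, alpha)
```

After search, the discrete architecture is recovered by keeping, on each edge, the operation with the largest softmax weight; the paper's observation that separable 3x3 convolution dominates motivates biasing attention toward such important operations.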
Keywords
differentiable architecture search,neural architecture search,competitive network architectures,computational complexity,search architectural building block,cell topology,DARTS,search network depth,search approach,randomly sampled architecture,time 30.0 min