ATL-DAS: Automatic Topology Learning for Differentiable Architecture Search

Displays (2023)

Abstract
Differentiable architecture search (DARTS) provides a fast solution to Neural Architecture Search (NAS) by using gradient descent over a continuous search space. Despite its high search speed, DARTS suffers from several problems such as ambiguous topology selection, operation co-adaptation, incomplete NAS pipeline, and large memory consumption. To address these problems, we first introduce topology parameters into the search space to explicitly model the network topology, which ensures the searched network architecture is well-defined. Next, we propose two sampling strategies to sample independent child networks for training and evaluation, which solve the co-adaptation problem while making the NAS pipeline complete. Finally, we use hard pruning to avoid invalid computations, which greatly reduces memory consumption. The proposed Automatic Topology Learning for Differentiable Architecture Search (ATL-DAS) algorithm performs favorably against the state-of-the-art approaches on CIFAR10 and CIFAR100 with error rates of 2.49% and 16.8%, respectively. Moreover, the searched architectures transfer well across various visual tasks, including ImageNet classification, COCO object detection, and Composition-1K image matting, highlighting the potential of ATL-DAS for real-world applications.
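To make the abstract's first contribution concrete, the following is a minimal toy sketch of adding explicit topology parameters on top of a DARTS-style mixed edge. In standard DARTS, each edge mixes candidate operations with softmax weights (here `alpha`); the sketch adds a separate softmax-weighted parameter (here `beta`) over a node's incoming edges to model topology explicitly. All names (`alpha`, `beta`, `mixed_edge`, `node_output`) and the scalar "feature maps" are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

# Toy candidate operations on an edge (scalars stand in for feature maps).
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def mixed_edge(x, alpha):
    """Standard DARTS edge: softmax-weighted sum of all candidate ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, OPS.values()))

def node_output(inputs, alphas, beta):
    """Node with explicit topology parameters (illustrative ATL-DAS-style):
    beta weights which incoming edges contribute, separately from the
    per-edge operation weights alpha, so edge selection is well-defined."""
    t = softmax(beta)
    return sum(ti * mixed_edge(x, a) for ti, x, a in zip(t, inputs, alphas))
```

With uniform parameters, each of the three ops gets weight 1/3, so `mixed_edge(1.0, [0.0, 0.0, 0.0])` returns (1 + 2 + 0)/3 = 1.0, and a node averaging two such edges also returns 1.0; discretizing the architecture then amounts to keeping the top-weighted ops and edges under `alpha` and `beta`.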
Keywords
Differentiable architecture search, Neural architecture search, Regularization, Network pruning