Parameter-less Pareto local search for multi-objective neural architecture search with the Interleaved Multi-start Scheme

Swarm and Evolutionary Computation (2024)

Abstract
With the emerging deployment of deep neural networks, for example in mobile devices and autonomous cars, there is a growing demand for neural architecture search (NAS) to automatically design powerful network architectures. Because practical deployments must balance several criteria, it is more reasonable to formulate NAS as a multi-objective optimization problem: in addition to prediction performance, multi-objective NAS (MONAS) problems take into account other criteria such as the number of parameters and inference latency. Multi-objective evolutionary algorithms (MOEAs) are the preferred approach for tackling MONAS due to their effectiveness on multi-objective optimization problems. Recently, local search-based NAS algorithms have demonstrated greater efficiency than MOEAs on MONAS problems; however, their performance has only been verified on bi-objective NAS problems. In this article, we propose a local search algorithm for multi-objective NAS (LOMONAS), an efficient framework for solving not only bi-objective NAS problems but also NAS problems with more than two objectives. We additionally present a parameter-less version of LOMONAS, namely IMS-LOMONAS, which combines LOMONAS with the Interleaved Multi-start Scheme (IMS) so that NAS practitioners can avoid setting control parameters manually. Experimental results on a series of benchmark problems from the CEC'23 Competition demonstrate the competitiveness of LOMONAS and IMS-LOMONAS compared to MOEAs for MONAS in both small-scale and large-scale search spaces. Source code is available at: https://github.com/ELO-Lab/IMS-LOMONAS.
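
For readers unfamiliar with Pareto local search, the sketch below illustrates the basic archive-driven loop that methods in the LOMONAS family build on. It is a minimal illustration under stated assumptions, not the paper's algorithm: `evaluate` and `neighbors` are hypothetical placeholders, and the actual LOMONAS neighborhood and archive-selection rules are described in the paper and the repository linked above.

```python
# A minimal sketch of Pareto local search (PLS). NOT the paper's method:
# `evaluate` and `neighbors` are hypothetical placeholders for an
# architecture-scoring function and a neighborhood generator.
from typing import Callable, Iterable, List, Tuple

Objectives = Tuple[float, ...]  # e.g. (validation error, #parameters), all minimized


def dominates(a: Objectives, b: Objectives) -> bool:
    """True if `a` Pareto-dominates `b`: no worse everywhere, strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_local_search(
    start: object,
    evaluate: Callable[[object], Objectives],         # hypothetical: scores an architecture
    neighbors: Callable[[object], Iterable[object]],  # hypothetical: small encoding edits
    max_evals: int = 1000,
) -> List[Tuple[object, Objectives]]:
    """Grow a non-dominated archive by exploring neighborhoods of archived solutions."""
    archive = [(start, evaluate(start))]
    unexplored = [start]  # archived solutions whose neighborhoods have not been visited
    evals = 1
    while unexplored and evals < max_evals:
        current = unexplored.pop()
        for nb in neighbors(current):
            f = evaluate(nb)
            evals += 1
            # Discard neighbors dominated by (or duplicating) the archive.
            if any(dominates(g, f) or g == f for _, g in archive):
                continue
            # Otherwise keep nb and drop archive members it now dominates.
            archive = [(s, g) for s, g in archive if not dominates(f, g)]
            archive.append((nb, f))
            unexplored.append(nb)
            if evals >= max_evals:
                break
    return archive
```

As the abstract describes, the IMS component of IMS-LOMONAS interleaves multiple such search instances so that no single control-parameter setting has to be chosen by hand; that scheduling is deliberately omitted from this sketch.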
Keywords
Neural architecture search, AutoML, Multi-objective optimization, Pareto local search, Parameter-less algorithms