Surrogate Many Objective Optimization: Combining Evolutionary Search, \(\epsilon\)-Dominance and Connected Restarts

WCGO (2019)

Abstract
Scaling multi-objective optimization (MOO) algorithms to handle many objectives is a significant computational challenge. The challenge is exacerbated when the underlying objectives are computationally expensive and solutions are desired within a limited number of expensive objective evaluations. A surrogate model-based optimization framework can be effective in MOO; however, most prior model-based algorithms are effective only for 2–3 objectives. This study investigates the combined use of \(\epsilon\)-dominance, connected restarts and evolutionary search for efficient many-objective optimization (MaOO). We build upon an existing surrogate-based evolutionary algorithm, GOMORS, and propose \(\epsilon\)-GOMORS, a surrogate-based iterative evolutionary algorithm that combines Radial Basis Function surrogates with \(\epsilon\)-dominance-based evolutionary search to propose new points for expensive evaluation in each algorithm iteration. Moreover, a novel connected restart mechanism is introduced to ensure that the optimization search does not get stuck in locally optimal fronts. \(\epsilon\)-GOMORS is applied to several benchmark multi-objective problems and a watershed calibration problem, and compared against GOMORS, ParEGO, NSGA-III, Borg, \(\epsilon\)-NSGA-II and MOEA/D on a limited budget of 1000 evaluations. Results indicate that \(\epsilon\)-GOMORS converges more quickly than the other algorithms, and the variance of its performance across multiple trials is also lower than that of the other algorithms.
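For readers unfamiliar with \(\epsilon\)-dominance, the following minimal Python sketch illustrates the standard additive \(\epsilon\)-dominance test that \(\epsilon\)-archiving methods build on; it is an illustration of the general concept only, not the authors' \(\epsilon\)-GOMORS implementation, and the function name `eps_dominates` is ours.

```python
import numpy as np

def eps_dominates(f, g, eps):
    """Return True if objective vector f additively eps-dominates g
    (minimization): f_i - eps_i <= g_i for every objective i.
    With eps = 0 this reduces to ordinary weak Pareto dominance.
    Note: this is a generic textbook definition, not code from the paper."""
    f, g, eps = np.asarray(f, float), np.asarray(g, float), np.asarray(eps, float)
    return bool(np.all(f - eps <= g))

# Usage: with eps = 0.1 per objective, f is "close enough" to dominate g
# even though it is slightly worse in the second objective.
f = [0.20, 0.55]
g = [0.30, 0.50]
print(eps_dominates(f, g, eps=[0.1, 0.1]))   # True
print(eps_dominates(f, g, eps=[0.0, 0.0]))   # False (plain dominance fails)
```

Relaxing dominance by \(\epsilon\) in this way coarsens the Pareto archive, which is what allows many-objective methods to keep the set of retained non-dominated points manageable.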
Keywords
Expensive optimization, Many objectives, Meta-models