A Three-Level Recursive Differential Grouping Method For Large-Scale Continuous Optimization

Hong-Bin Xu, Fei Li, Hao Shen

IEEE Access (2020)

Abstract
Cooperative co-evolution (CC) is widely used to solve large-scale continuous optimization problems: a decomposition method divides the large-scale problem into several small-scale sub-problems, and each sub-problem is then optimized separately. The performance of CC, however, depends heavily on the decomposition method. A recently proposed bisection-based decomposition method, recursive differential grouping (RDG), performs well on large-scale continuous optimization problems. To further improve on RDG, this paper develops a novel decomposition method called three-level recursive differential grouping (TRDG). In TRDG, when an interaction between two sets of variables is detected, the variables in one of the sets are divided into three subsets (a trichotomy), and the interaction between each subset and the other set is then detected. Compared with RDG, TRDG reduces the recursion depth and therefore the number of fitness evaluations (FEs). In addition, we devise a novel strategy to adaptively update the threshold used to identify interactions between variables. Experimental results on the CEC'2010 and CEC'2013 benchmark functions show that TRDG outperforms several existing decomposition methods in terms of decomposition accuracy and the number of FEs. Furthermore, TRDG is embedded into two CC frameworks to tackle the CEC'2010 large-scale continuous optimization problems.
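To illustrate the trichotomy idea described in the abstract, the sketch below shows a simplified three-way recursive refinement: once a set of variables `set2` is found to interact with `set1`, it is split into three subsets per level (instead of RDG's two), and each subset is tested again. This is a minimal illustration, not the paper's actual procedure; the finite-difference interaction test `interact`, the base point, the perturbation choices, and the fixed threshold `eps` are all assumptions (the paper uses an adaptively updated threshold).

```python
import numpy as np

def interact(f, x_base, set1, set2, lb, ub, eps):
    """Simplified RDG-style test: set1 and set2 interact if the fitness change
    caused by perturbing set1 depends on the values of the variables in set2."""
    idx1, idx2 = list(set1), list(set2)
    a = x_base.copy()
    a2 = a.copy()
    a2[idx1] = ub[idx1]                      # perturb the variables in set1
    delta1 = f(a2) - f(a)

    b = a.copy()
    b[idx2] = (lb[idx2] + ub[idx2]) / 2.0    # move the variables in set2
    b2 = b.copy()
    b2[idx1] = ub[idx1]                      # same perturbation of set1 again
    delta2 = f(b2) - f(b)

    return abs(delta1 - delta2) > eps        # differing deltas => interaction

def trdg_refine(f, set1, set2, x_base, lb, ub, eps):
    """Return the variables in set2 that interact with set1, splitting set2
    into three roughly equal subsets at each level of the recursion."""
    if not interact(f, x_base, set1, set2, lb, ub, eps):
        return set()
    if len(set2) == 1:
        return set(set2)
    found = set()
    for part in np.array_split(sorted(set2), 3):   # trichotomy step
        if len(part):
            found |= trdg_refine(f, set1, set(part.tolist()),
                                 x_base, lb, ub, eps)
    return found
```

As a rough usage example, calling `trdg_refine(f, {0}, set(range(1, n)), lb.copy(), lb, ub, 1e-6)` would return the variables interacting with variable 0; splitting into three parts rather than two is what reduces the recursion depth and, in the paper's analysis, the number of FEs.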
Keywords
Large-scale continuous optimization, cooperative co-evolution (CC), differential grouping, trichotomy method