A binary hybrid sine cosine white shark optimizer for feature selection

Cluster Computing (2024)

Abstract
Feature Selection (FS), a pre-processing step used in the majority of big data processing applications, aims to eliminate irrelevant and redundant features from the data. Its purpose is to select a final set of features that best represents the data as a whole. To achieve this, it explores the space of candidate subsets in order to identify the optimal one. Meta-heuristic algorithms have been found to be particularly effective in solving FS problems, especially for high-dimensional datasets. This work adopts a recently developed meta-heuristic called the White Shark Optimizer (WSO) due to its simplicity and low computational overhead. However, WSO faces challenges in effectively balancing exploration and exploitation, particularly in complex multi-peak search problems. It tends to converge prematurely and get stuck in local optima, which can lead to poor search performance when dealing with FS problems. To overcome these issues, this paper presents three enhanced binary variants of WSO for well-known FS problems. These variants are as follows: (1) Binary Distribution-based WSO (BDWSO), in which the algorithm refines the positions of white sharks by considering both the average and the standard deviation of the current shark, the local best shark, and the global best shark; this strategy is designed to alleviate premature convergence and stagnation during the iterations; (2) Binary Sine Cosine WSO (BSCWSO), which uses sine and cosine adaptive functions for the social and cognitive components of the position update rule; and (3) Binary Hybrid Sine Cosine WSO (BHSCWSO), which employs sine and cosine acceleration factors to regulate local search and achieve convergence to the global optimal solution. Additionally, the population was initialized using the Opposition-Based Learning (OBL) mechanism, and the sine map was used to modify the inertia weight of WSO. These revised variants of WSO are designed to achieve a better balance between exploration and exploitation. The proposed methods were extensively compared against the fundamental binary WSO and other well-known algorithms in the field. The experimental findings and comparisons demonstrate that the proposed methods outperform the conventional algorithm and most of the evaluated peer algorithms in terms of robustness and solution quality. In terms of classification accuracy, number of selected features, specificity, sensitivity, and fitness values, the proposed BHSCWSO optimizer performed better than the other peer optimizers (BSWO, BDWSO, and BSCWSO) on 11, 8, 13, 18, and 10 datasets, respectively. The proposed BHSCWSO optimizer showed performance levels of more than 90%.
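To make the mechanisms mentioned in the abstract concrete, the following minimal Python sketch illustrates an OBL-style population initialization, a sine-map-driven inertia weight, sine/cosine acceleration factors, and a sigmoid transfer function for binarizing continuous positions into feature subsets. The exact update equations of BDWSO, BSCWSO, and BHSCWSO are defined in the paper, not in this abstract; the formulas, schedules, and helper names below (`obl_init`, `sine_map`, `fs_fitness`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(x):
    """S-shaped transfer function mapping a continuous position to a selection probability."""
    return 1.0 / (1.0 + np.exp(-x))


def fs_fitness(position, threshold=0.5):
    """Placeholder fitness: fraction of selected features. A real FS objective would
    combine classification error with the selected-feature ratio (an assumption here)."""
    return (sigmoid(position) > threshold).mean()


def obl_init(pop_size, dim, lb=-4.0, ub=4.0):
    """Opposition-Based Learning (OBL) initialization: sample a random population,
    form its opposite (lb + ub - x), and keep the pop_size fittest candidates."""
    pop = rng.uniform(lb, ub, size=(pop_size, dim))
    combined = np.vstack([pop, lb + ub - pop])
    order = np.argsort([fs_fitness(ind) for ind in combined])
    return combined[order[:pop_size]]


def sine_map(w, a=0.99):
    """One step of the chaotic sine map, used here to vary the inertia weight."""
    return a * np.sin(np.pi * w)


pop_size, dim, max_iter = 10, 20, 50
population = obl_init(pop_size, dim)
gbest = min(population, key=fs_fitness).copy()
w = rng.uniform(0.01, 0.99)

for t in range(max_iter):
    w = sine_map(w)
    # Assumed sine/cosine acceleration factors: the cognitive term decays while
    # the social term grows as the iterations progress.
    c1 = np.sin((1 - t / max_iter) * np.pi / 2)
    c2 = np.cos((1 - t / max_iter) * np.pi / 2)
    for i in range(pop_size):
        r1, r2 = rng.random(dim), rng.random(dim)
        population[i] = (w * population[i]
                         + c1 * r1 * (population[i] - gbest)
                         + c2 * r2 * (gbest - population[i]))
    best = min(population, key=fs_fitness)
    if fs_fitness(best) < fs_fitness(gbest):
        gbest = best.copy()

print("Selected features:", np.where(sigmoid(gbest) > 0.5)[0])
```

In a practical FS setting, the placeholder fitness would be replaced by a wrapper objective (for example, a weighted sum of classification error and the ratio of selected features), and the sigmoid transfer function is only one of several common S- and V-shaped binarization choices.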
Keywords
Binary white shark optimizer, Feature selection, Classification, Optimization