Feature Selection Based on Improved White Shark Optimizer
Feature Selection (FS) is an optimization problem that aims to reduce the dimensionality and improve the quality of a dataset by retaining relevant features while excluding redundant ones. It improves the classification accuracy of a dataset and holds a crucial position in the field of data mining. Metaheuristic algorithms are an effective means of searching for feature subsets and optimizing the FS problem. The White Shark Optimizer (WSO), a metaheuristic algorithm, primarily simulates the hearing- and smell-guided behavior of great white sharks during swimming and hunting. However, it does not consider other randomly occurring behaviors, such as the Tail Slapping and Clustered Together behaviors. The Tail Slapping behavior increases population diversity and improves the global search performance of the algorithm, while the Clustered Together behavior, which includes approaching food and mating, changes the direction of local search and enhances local exploitation. By incorporating the Tail Slapping and Clustered Together behaviors into the original algorithm, an Improved White Shark Optimizer (IWSO) is proposed. The two behaviors and the resulting IWSO are tested separately on the CEC2017 benchmark functions, and the results of IWSO are compared with those of other metaheuristic algorithms, demonstrating that IWSO, which combines the two behaviors, has stronger search capability. The FS problem is formulated as an optimization model whose objective is a weighted combination of the feature subset size and the classification error rate. This model is iteratively optimized by a discretized IWSO combined with the K-Nearest Neighbor (KNN) classifier on 16 benchmark datasets, and the results are compared with 7 other metaheuristic algorithms. Experimental results show that IWSO is more effective at selecting feature subsets and improving classification accuracy.
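The abstract formulates FS as minimizing a weighted combination of classification error rate and feature subset size, evaluated by a KNN classifier over a discretized (binary) search space. The sketch below illustrates one common way such a wrapper objective and binarization step can be realized; the weight value, the sigmoid transfer function, and the 70/30 train/test split are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def fs_fitness(mask, X, y, alpha=0.99, k=5):
    """Weighted FS objective (to be minimized):
    alpha * KNN error rate + (1 - alpha) * |selected features| / |all features|.
    alpha is an assumed weight, not a value reported in the paper."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:                      # empty subset: worst possible fitness
        return 1.0
    X_tr, X_te, y_tr, y_te = train_test_split(
        X[:, selected], y, test_size=0.3, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    error = 1.0 - knn.score(X_te, y_te)         # classification error rate
    ratio = selected.size / X.shape[1]          # relative subset size
    return alpha * error + (1 - alpha) * ratio

def binarize(position, rng=np.random.default_rng(0)):
    """Map a continuous shark position to a binary feature mask via a
    sigmoid transfer function (one common discretization choice)."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)
```

In a wrapper FS loop, each candidate position produced by IWSO would first be binarized and then scored with `fs_fitness`, so that the optimizer simultaneously pushes the error rate and the number of selected features downward.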