Improved Golden Jackal Optimization Algorithm for Solving Function Optimization and Feature Selection
The basic Golden Jackal Optimization (GJO) algorithm had several drawbacks, such as low computational precision, weak exploitation, and a tendency to become trapped in local optima when solving high-dimensional optimization problems. An improved GJO algorithm (I-GJO) was proposed. In I-GJO, the original randomly decreasing energy factor was replaced by a nonlinear decreasing factor based on the sine function to balance the algorithm's global exploration and local exploitation abilities during the search process. In the later iterative stage of the algorithm, a somersault learning strategy was introduced to expand the population's search region and improve solution precision. To verify the effectiveness of the proposed I-GJO algorithm, six benchmark function optimization problems were selected for experiments. The experimental results indicated that I-GJO achieved higher precision and faster convergence than the Grey Wolf Optimizer (GWO), the Seagull Optimization Algorithm (SOA), and the basic GJO algorithm. Finally, I-GJO was applied to the feature selection problem. The numerical results on sixteen benchmark datasets showed that I-GJO could effectively remove redundant features and improve classification accuracy.
Golden Jackal Optimization algorithm; somersault learning strategy; function optimization; feature selection
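The abstract describes two modifications to GJO: a sine-based nonlinear decreasing energy factor and a somersault learning step applied in late iterations. The exact equations are not given above, so the Python sketch below is only an illustrative assumption of how such components are commonly formulated (the function names, the 1.5 upper bound on the energy factor, and the somersault factor of 2 are all assumed, not taken from the paper).

```python
# Hedged sketch of the two I-GJO components named in the abstract.
# Both formulas are assumptions for illustration, not the authors' exact equations.
import numpy as np

def sine_energy_factor(t, T, E_max=1.5):
    """Assumed nonlinear decreasing energy factor.

    Replaces GJO's randomly decreasing factor with a smooth,
    sine-shaped decay from E_max toward 0 over T iterations,
    shifting the search from exploration to exploitation.
    """
    E0 = 2.0 * np.random.rand() - 1.0                 # random component in [-1, 1]
    E1 = E_max * np.sin((np.pi / 2.0) * (1.0 - t / T))  # nonlinear decay with iteration t
    return E1 * E0

def somersault_update(x, x_best, somersault_factor=2.0):
    """Assumed somersault learning step.

    Reflects a solution around the current best position to widen
    the search region in later iterations and help escape local optima.
    """
    r1 = np.random.rand(x.size)
    r2 = np.random.rand(x.size)
    return x + somersault_factor * (r1 * x_best - r2 * x)
```

A typical use would be to compute the energy factor once per iteration to decide between exploration- and exploitation-style position updates, and to apply the somersault step to each individual only after a chosen fraction of the iteration budget has elapsed.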