Feature selection using forest optimization algorithm improved by XGBoost
The classical feature selection using forest optimization algorithm (FSFOA) suffers from blind initialization and neglects dimension reduction in classification tasks. Ensemble learning algorithms can assess feature importance efficiently, so an improved approach is proposed that uses an ensemble-learning-inspired initialization strategy based on importance measurement. The new method is further combined with a backward-elimination optimal-subset selection strategy to form a novel feature selection algorithm, called feature selection using forest optimization algorithm improved by XGBoost (FSFOAX). Comparative experiments on seven UCI datasets of different dimensionalities show that FSFOAX outperforms FSFOA. Even compared with recent high-performing wrapper-based feature selection algorithms, FSFOAX remains competitive in classification accuracy. These results indicate that FSFOAX improves upon FSFOA and is better suited to feature selection tasks.
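The core idea of replacing blind initialization with importance-guided initialization can be sketched as follows. This is a minimal illustration, not the paper's implementation: `GradientBoostingClassifier` stands in for XGBoost, the function name and probability scaling are illustrative assumptions, and each "tree" in the forest is represented as a binary feature mask.

```python
# Hedged sketch of importance-guided initialization for FSFOA-style
# feature selection. A gradient-boosted model estimates feature
# importance, and initial feature subsets are sampled with selection
# probabilities biased toward important features, instead of uniformly
# at random (blind initialization).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

def importance_guided_init(X, y, n_trees=10, rng=None):
    """Return an (n_trees, n_features) 0/1 array of initial feature masks,
    biased toward features the boosted model ranks as important."""
    rng = np.random.default_rng(rng)
    model = GradientBoostingClassifier(random_state=0).fit(X, y)
    imp = model.feature_importances_
    # Map importances to selection probabilities in [0.2, 0.8] so even
    # low-importance features can still appear in some initial trees
    # (the exact range is an illustrative choice, not from the paper).
    p = 0.2 + 0.6 * (imp - imp.min()) / (np.ptp(imp) + 1e-12)
    return (rng.random((n_trees, X.shape[1])) < p).astype(int)

X, y = load_breast_cancer(return_X_y=True)
forest = importance_guided_init(X, y, n_trees=10, rng=42)
print(forest.shape)  # (10, 30): ten initial trees over 30 features
```

Each row of `forest` would then seed one tree of the forest optimization algorithm, so the search starts from subsets already enriched with informative features rather than from uniform random subsets.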