Evaluation of Hyperparameter Optimization Techniques for Traditional Machine Learning Models
Reasonable hyperparameter settings enable machine learning models to adapt to different domains and tasks. To avoid the inefficiency of manually tuning a large number of model hyperparameters over a vast search space, a variety of hyperparameter optimization techniques have been developed and applied to machine learning model training. This paper first reviews eight common hyperparameter optimization techniques: grid search, random search, Bayesian optimization, Hyperband, Bayesian Optimization and Hyperband (BOHB), genetic algorithms, particle swarm optimization, and the covariance matrix adaptation evolution strategy (CMA-ES). The advantages and disadvantages of these methods are analyzed from five aspects: time performance, final results, parallel capability, scalability, and robustness and flexibility. These eight methods are then applied to four traditional machine learning models: LightGBM, XGBoost, Random Forest, and K-Nearest Neighbors (KNN). Regression, binary classification, and multi-class classification experiments are performed on four standard datasets: the Boston housing price dataset, the kin8nm robot arm dataset, the credit card default dataset, and the handwritten digits dataset. The methods are compared by evaluating their performance using the resulting evaluation metrics. Finally, the pros and cons of each method are summarized, and suitable application scenarios for the different methods are identified. The results highlight the importance of selecting an appropriate hyperparameter optimization method to enhance the efficiency and effectiveness of machine learning model training.
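To make the contrast between the two simplest techniques concrete, the sketch below compares grid search (exhaustive enumeration of the search space) with random search (a fixed sampling budget). The objective function, hyperparameter names, and value ranges are illustrative stand-ins for a real model's validation score, not values from the paper's experiments.

```python
import itertools
import random

# Toy "validation score" standing in for training and evaluating a real
# model; it peaks at learning_rate=0.1, num_leaves=31 (illustrative values).
def score(learning_rate, num_leaves):
    return -((learning_rate - 0.1) ** 2) - ((num_leaves - 31) / 100) ** 2

# A small, hypothetical search space of two hyperparameters.
space = {
    "learning_rate": [0.01, 0.05, 0.1, 0.2, 0.5],
    "num_leaves": [15, 31, 63, 127],
}

# Grid search: evaluate every point in the Cartesian product of the space.
grid_best = max(
    (dict(zip(space, combo)) for combo in itertools.product(*space.values())),
    key=lambda p: score(**p),
)

# Random search: sample only a fixed budget of configurations at random.
random.seed(0)
random_best = max(
    ({k: random.choice(v) for k, v in space.items()} for _ in range(8)),
    key=lambda p: score(**p),
)

print(grid_best)
print(random_best)
```

Grid search always finds the best point on the grid but its cost grows multiplicatively with each added hyperparameter, while random search's budget is fixed in advance, which is why the latter often scales better to large search spaces.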
Keywords: Traditional machine learning; Hyperparameter optimization; Bayesian optimization; Multi-fidelity technology; Meta-heuristic algorithms