Averaged one-dependence estimators classification algorithm based on divergence weighting
The averaged one-dependence estimators (AODE) algorithm is an important extension of the naive Bayes classification algorithm. However, AODE treats all attributes equally, which limits its ability to improve classification performance. To accurately characterize the role of each attribute in classification and further improve the classification performance of AODE, this paper proposes a divergence-weighted AODE classification algorithm. The method introduces two divergence metrics, the Kullback-Leibler divergence and the Jensen-Shannon divergence, and uses them to determine the weights of the super-parent one-dependence estimators in the AODE classification framework. Each weight is based on the divergence between the prior distribution of the class variable and its posterior distribution given the attribute values, which yields a more effective combination of the super-parent one-dependence estimators. Experiments on 36 data sets from the University of California machine learning repository show that the divergence-weighted AODE algorithm significantly outperforms the original AODE algorithm. Consequently, divergence weighting can effectively improve the classification performance of AODE.
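To make the weighting idea concrete, the sketch below shows one plausible way to turn prior-versus-posterior divergences into weights for the super-parent one-dependence estimators. It is a minimal illustration, not the authors' implementation: the helper names (`kl_divergence`, `js_divergence`, `spode_weights`) and the normalization of the raw divergences to sum to one are assumptions made for this example.

```python
# A minimal sketch (not the paper's code) of divergence-based weighting for
# super-parent attributes, assuming each weight reflects how far the class
# posterior P(y | x_i = v) moves away from the class prior P(y).
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, smoothed form of KL."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p / p.sum() + q / q.sum())
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def spode_weights(class_prior, posteriors, metric=js_divergence):
    """Weight each super-parent attribute by the divergence between the class
    prior and the class posterior given that attribute's value, then normalize
    the weights to sum to one (an illustrative choice, not from the paper)."""
    raw = np.array([metric(post, class_prior) for post in posteriors])
    total = raw.sum()
    return raw / total if total > 0 else np.full(len(raw), 1.0 / len(raw))

# Example: three super-parent attributes on a binary-class problem.
prior = np.array([0.6, 0.4])                      # P(y)
posteriors = [np.array([0.9, 0.1]),               # P(y | x_1 = v_1)
              np.array([0.55, 0.45]),             # P(y | x_2 = v_2)
              np.array([0.2, 0.8])]               # P(y | x_3 = v_3)
print(spode_weights(prior, posteriors))
```

In this sketch, attributes whose observed values shift the class distribution farther from the prior receive larger weights, so their one-dependence estimators contribute more to the final averaged prediction; the choice between the KL and Jensen-Shannon metrics is exposed through the `metric` parameter.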