Distribution based ensemble for class imbalance learning
The MultiBoost ensemble is well acknowledged as an effective learning algorithm that reduces both bias and variance in error and achieves high generalization performance. However, to deal with class imbalanced learning, MultiBoost must be amended. In this paper, a new hybrid machine learning method called Distribution based MultiBoost (DBMB) for class imbalanced problems is proposed, which combines distribution based balanced sampling with the MultiBoost algorithm to achieve better minority class performance. It minimizes the within-class and between-class imbalance by learning and sampling different distributions (Gaussian and Poisson), and reduces bias and variance in error by employing the MultiBoost ensemble. Therefore, DBMB can output a final strong learner that is a more proficient ensemble of weak base learners for imbalanced data sets. We show that the G-mean, F1 measure and AUC of DBMB are significantly superior to those of competing methods. Experimental verification shows that the proposed DBMB outperforms other state-of-the-art algorithms on many real-world class imbalanced problems. Furthermore, the proposed method is scalable compared to other boosting methods.
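The abstract's core idea of distribution based balanced sampling can be sketched as follows. This is a simplified illustration, not the authors' DBMB implementation: it fits only a Gaussian to the minority class features (the paper also uses Poisson sampling) and omits the MultiBoost ensemble stage; the function name `gaussian_balance` and all parameters are illustrative assumptions.

```python
import numpy as np

def gaussian_balance(X, y, minority_label=1, rng=None):
    """Oversample the minority class by fitting a Gaussian to its
    features and drawing synthetic samples until the classes are
    balanced. Simplified sketch of distribution based resampling;
    DBMB additionally uses Poisson sampling and MultiBoost."""
    rng = np.random.default_rng(rng)
    X_min = X[y == minority_label]
    X_maj = X[y != minority_label]
    n_new = len(X_maj) - len(X_min)
    if n_new <= 0:  # already balanced
        return X, y
    mu = X_min.mean(axis=0)
    # Regularize the covariance so sampling stays well-conditioned
    # even when the minority class has very few samples.
    cov = np.cov(X_min, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    X_syn = rng.multivariate_normal(mu, cov, size=n_new)
    X_bal = np.vstack([X, X_syn])
    y_bal = np.concatenate([y, np.full(n_new, minority_label)])
    return X_bal, y_bal

# Toy imbalanced data: 90 majority vs 10 minority samples.
data_rng = np.random.default_rng(0)
X = np.vstack([data_rng.normal(0, 1, (90, 2)),
               data_rng.normal(3, 1, (10, 2))])
y = np.array([0] * 90 + [1] * 10)
X_bal, y_bal = gaussian_balance(X, y, rng=0)
print((y_bal == 0).sum(), (y_bal == 1).sum())  # equal class counts
```

In the full method, the balanced set produced this way would then be fed to the MultiBoost ensemble (wagging over AdaBoost sub-committees), which supplies the bias and variance reduction claimed above.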
Keywords: class imbalance learning; MultiBoost; distribution based resampling; ensemble learning
Mustafa, Ghulam; Niu, Zhendong; Yousif, Abdallah; Tarus, John
School of Computer Science and Technology, Beijing Institute of Technology, 5 South Zhongguancun Street, Beijing 100081, China
International Conference on Innovative Computing Technology
Galicia (ES)
2015 Fifth International Conference on Innovative Computing Technology