TY - GEN
T1 - GOBoost: G-mean Optimized Boosting framework for class imbalance learning
T2 - 12th World Congress on Intelligent Control and Automation, WCICA 2016
AU - Lu, Yang
AU - Cheung, Yiu Ming
AU - Tang, Yuan Yan
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/9/27
Y1 - 2016/9/27
AB - Boosting-based methods are effective for the class imbalance problem, in which the numbers of samples in two or more classes are severely unequal. However, the classifier weights of existing boosting-based methods are calculated by minimizing the error rate, which is inconsistent with the objective of class imbalance learning. As a result, the classifier weights cannot properly represent the performance of the individual classifiers when the data is imbalanced. In this paper, we therefore propose a G-mean Optimized Boosting (GOBoost) framework that assigns classifier weights optimized on G-mean. Consequently, high weights are assigned to classifiers with high accuracy on both the majority class and the minority class. The GOBoost framework can be applied to any AdaBoost-based method for class imbalance learning by simply replacing the calculation of the classifier weights. Accordingly, we extend six AdaBoost-based methods to GOBoost-based methods for comparative studies in class imbalance learning. Experiments conducted on 12 real class imbalance data sets show that the GOBoost-based methods significantly outperform the corresponding AdaBoost-based methods in terms of the F1 and G-mean metrics.
UR - http://www.scopus.com/inward/record.url?scp=84991660109&partnerID=8YFLogxK
U2 - 10.1109/WCICA.2016.7578792
DO - 10.1109/WCICA.2016.7578792
M3 - Conference proceeding
AN - SCOPUS:84991660109
T3 - Proceedings of the World Congress on Intelligent Control and Automation (WCICA)
SP - 3149
EP - 3154
BT - Proceedings of the 2016 12th World Congress on Intelligent Control and Automation, WCICA 2016
PB - IEEE
Y2 - 12 June 2016 through 15 June 2016
ER -