Boosting-based methods are effective for the class imbalance problem, in which the numbers of samples in two or more classes are severely unequal. However, the classifier weights of existing boosting-based methods are calculated by minimizing the error rate, which is inconsistent with the objective of class imbalance learning. As a result, the classifier weights cannot properly represent the performance of individual classifiers when the data is imbalanced. In this paper, we therefore propose a G-mean Optimized Boosting (GOBoost) framework that assigns classifier weights optimized on G-mean. In this way, high weights are assigned to classifiers that achieve high accuracy on both the majority class and the minority class. The GOBoost framework can be applied to any AdaBoost-based method for class imbalance learning by simply replacing the calculation of the classifier weights. Accordingly, we extend six AdaBoost-based methods to GOBoost-based methods for comparative studies in class imbalance learning. Experiments conducted on 12 real-world class-imbalanced data sets show that GOBoost-based methods significantly outperform the corresponding AdaBoost-based methods in terms of the F1 and G-mean metrics.
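To illustrate the idea of replacing the error-rate-based classifier weight with a G-mean-based one, the following is a minimal sketch of an AdaBoost-style loop. The abstract does not specify the exact weighting formula, so the function names (`goboost_like_fit`, `goboost_like_predict`) and the particular weight `alpha = 0.5 * ln(g / (1 - g))` are illustrative assumptions, not the method defined in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def g_mean(y_true, y_pred):
    """Geometric mean of recall on the minority (1) and majority (0) classes."""
    pos, neg = y_true == 1, y_true == 0
    sens = np.mean(y_pred[pos] == 1) if pos.any() else 0.0
    spec = np.mean(y_pred[neg] == 0) if neg.any() else 0.0
    return np.sqrt(sens * spec)

def goboost_like_fit(X, y, n_rounds=10, eps=1e-10):
    """Sketch of an AdaBoost-style loop whose classifier weights are driven by
    G-mean rather than the weighted error rate (hypothetical weighting formula)."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # sample weights, as in AdaBoost
    learners, alphas = [], []
    for _ in range(n_rounds):
        clf = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        g = g_mean(y, pred)                      # accuracy on both classes
        # Assumed weight: analogous to AdaBoost's 0.5*ln((1-err)/err),
        # but monotone in G-mean instead of overall accuracy.
        alpha = 0.5 * np.log((g + eps) / (1.0 - g + eps))
        miss = pred != y
        w *= np.exp(alpha * miss)                # standard AdaBoost re-weighting step
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, np.array(alphas)

def goboost_like_predict(learners, alphas, X):
    """Weighted vote over the ensemble, mapping {0,1} labels to {-1,+1}."""
    votes = sum(a * (2 * clf.predict(X) - 1) for clf, a in zip(learners, alphas))
    return (votes > 0).astype(int)
```

Because only the classifier-weight calculation changes, the same substitution could in principle be dropped into other AdaBoost-based variants, which is the extension the paper carries out for six such methods.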