
Learning from class imbalanced data is a challenging hotspot in the field of machine learning [1,2]. The class imbalance problem also arises widely in real-world applications, including biological data processing [3], business data analysis [4], industrial fault detection [5], face recognition [6] and crime linkage discovery [7]. In these applications, users generally focus on the minority class, which has fewer training instances than the other classes. However, conventional supervised learning algorithms minimize the overall training error, so they inevitably favor the majority class and sacrifice performance on the minority class.
Over the past two decades, many learning algorithms have been proposed for the imbalanced classification problem. They mainly include sampling [8,9,10,11,12,13,14,15,16], cost-sensitive learning [17,18,19], threshold-moving [20,21,22], one-class learning [23] and ensemble learning [24,25,26,27,28,29,30,31] or multiple classifier systems [32,33,34,35,36,37,38,39]. Sampling is a data preprocessing strategy that adds minority instances (oversampling) or removes majority instances (undersampling) to balance the training set, so that conventional supervised learning algorithms can then be applied as usual. Cost-sensitive learning assigns higher penalty weights to minority instances to force the learning algorithm to pay more attention to the minority class. Threshold-moving is a post-processing solution that empirically or adaptively tunes the decision threshold to push the classification hyperplane towards the majority class. One-class learning, which describes the boundary of one class by training only on the instances of that class, is generally used for highly imbalanced problems. Ensemble learning combines one of the other class imbalance learning methods, e.g., sampling, cost-sensitive learning or threshold-moving, with an ensemble paradigm such as bagging or boosting to classify imbalanced data. Compared with single models, ensemble learning is expected to greatly improve classification performance, especially generalization ability, on class imbalanced data. In view of these merits, ensemble learning has been widely studied and adopted in the context of imbalanced data classification.
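To make the sampling strategies concrete, below is a minimal sketch of random undersampling and random oversampling. It assumes binary labels with the minority class coded as 1 and the majority class as 0; the function names and the NumPy-based implementation are illustrative, not the paper's actual code.

```python
import numpy as np

def random_undersample(X, y, rng=None):
    """Toy RUS: randomly discard majority instances until classes are equal-sized."""
    rng = np.random.default_rng(rng)
    min_idx = np.flatnonzero(y == 1)   # minority class assumed coded as 1
    maj_idx = np.flatnonzero(y == 0)
    keep = rng.choice(maj_idx, size=len(min_idx), replace=False)
    idx = np.concatenate([min_idx, keep])
    return X[idx], y[idx]

def random_oversample(X, y, rng=None):
    """Toy ROS: randomly duplicate minority instances until classes are equal-sized."""
    rng = np.random.default_rng(rng)
    min_idx = np.flatnonzero(y == 1)
    maj_idx = np.flatnonzero(y == 0)
    extra = rng.choice(min_idx, size=len(maj_idx) - len(min_idx), replace=True)
    idx = np.concatenate([np.arange(len(y)), extra])
    return X[idx], y[idx]
```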
As is well known, an effective ensemble classifier must satisfy two key conditions: each base classifier should perform reasonably well, and the base classifiers in the ensemble should be as diverse as possible [40]. Most class imbalance ensemble learning algorithms satisfy the first condition by assigning well-distributed training subsets and designating a robust supervised learning algorithm to train each base classifier. The second condition, diversity, is harder to satisfy, because most algorithms acquire diversity only by perturbing the distribution of instances or costs in each training subset. In fact, an ensemble model presents its strongest performance only when single performance and group diversity are balanced [38].
In this study, we focus on how to further improve the diversity of ensemble classifiers and how to find the best tradeoff between single performance and group diversity. For the first problem, we borrow the idea of [39] and combine multiple different sampling algorithms with multiple different types of classifiers to construct a heterogeneous ensemble learning algorithm. As indicated in [39], perturbing instances and classifiers simultaneously provides more diversity for an ensemble and also improves its quality. For the second problem, considering that the potential contribution of each base classifier cannot be estimated simply from its individual performance, an evolutionary genetic algorithm is adopted to adaptively select a high-quality combination of base classifiers for decision making. According to the "many could be better than all" theorem [41], this selective ensemble strategy should help find a better tradeoff between single performance and group diversity. We call the proposed algorithm the Selective Evolutionary Heterogeneous Ensemble, or SEHE for short. To verify its effectiveness and feasibility, we compare SEHE with a large number of state-of-the-art class imbalance ensemble learning algorithms on 42 imbalanced data sets from the Keel data repository [42] and the UCI machine learning repository [43], and the results show the superiority of SEHE.
The main contribution of this study is combining the ideas of the heterogeneous ensemble and the selective ensemble to improve the performance of class imbalanced learning. To the best of our knowledge, this is the first time these two ideas have been integrated to address the class imbalance learning problem. The combination not only increases diversity production but also helps to find the best tradeoff between single performance and group diversity.
The remainder of this paper is organized as follows. Section 2 reviews the related work about class imbalance ensemble learning. Section 3 describes the proposed SEHE algorithm and its framework in detail. In Section 4, the experimental results and analysis are presented. Finally, Section 5 concludes this study and indicates the future research directions.
As mentioned above, many different techniques and solutions have been proposed to solve the class imbalance problem over the past two decades, among which ensemble learning is the most successful and popular [1,2]. The general idea of class imbalance ensemble learning is to combine a single class imbalance learning technique, e.g., sampling, cost-sensitive learning or threshold-moving, with a Bagging or Boosting ensemble paradigm to improve the generalization capability of the classification model. Here, the single technique aims to acquire an approximately unbiased base classifier, while the ensemble aims to promote the robustness of the decision model.
Some early class imbalance ensemble learning algorithms, including UnderBagging [24], OverBagging [37], SMOTEBagging [37], RUSBoost [33], SMOTEBoost [25], EUSBoost [28] and Asymmetric Bagging [36], directly combine a simple undersampling or oversampling algorithm with a Bagging or Boosting framework. Their merit lies in being simple and easy to implement; however, they produce diversity only by sampling different data subsets. When an undersampling strategy is used to generate training subsets, the diversity of the subsets is high, but the performance of each single base model cannot be safely guaranteed. Conversely, oversampling safeguards the performance of a single model but always hurts group diversity. Yu et al. [38] indicated that the quality of an ensemble is maximized if and only if there exists an approximate tradeoff between single performance and group diversity. They integrated several techniques, including feature selection, feature subspace and asymmetric bagging, into an ensemble algorithm called asBagging-FSS for classifying high-dimensional imbalanced biomedicine data. In contrast to Asymmetric Bagging [36], asBagging-FSS simultaneously enhances group diversity by inserting disturbances in the feature space and single performance by adding a feature selection procedure. However, this solution might be ineffective on low-dimensional class imbalanced data.
Another group of class imbalance ensemble learning algorithms couples instance costs with ensemble learning. In fact, the cost of an instance can be naturally combined with the Boosting paradigm. Well-known algorithms integrating cost-sensitive learning and Boosting include AdaCost [27], AdaC1 [34], AdaC2 [34] and AdaC3 [34]. These ensembles gradually focus on classifying difficult instances through iterative feedback; in this fashion, the diversity between two consecutive base classifiers can be seen as the variance of the cost distributions.
Several algorithms also combine the threshold-moving technique with the Bagging paradigm, such as PT-Bagging [26] and EnSVM-OTHR [20]. Specifically, PT-Bagging produces diversity relying only on the bootstrap strategy that is a natural step of Bagging, while EnSVM-OTHR additionally applies a small random disturbance to the adjusted threshold to acquire diverse base classifiers.
To enhance the performance of a single model in an ensemble, some previous studies use an ensemble learning model as the single model, i.e., the algorithm can be seen as an ensemble of ensembles. Such cases include EasyEnsemble [30], BalanceCascade [30] and the GIR series algorithms [35]. Specifically, in the GIR series, two sampling approaches convert the class imbalance problem into several balanced sub-problems; in each sub-problem, several weak classifiers are trained by Boosting and then combined by Bagging. These two-level ensemble algorithms ignore the effect of diversity, and hence the performance improvement is generally limited.
In recent years, more and more researchers have begun to study how to promote diversity in an ensemble. Strategies for enhancing ensemble diversity include evolutionary approaches [29,31] and heterogeneous ensembles [39]. Roshan and Asadi [31] adopted a popular evolutionary multi-objective optimization algorithm, NSGA-II, to synchronously optimize three goals: the class imbalance ratio (IR) in each bag, the group diversity of the ensemble and the AUC performance metric. Lim et al. [29] presented a novel cluster-based synthetic oversampling (CSO) algorithm, encoded its five hyper-parameters per bag into a chromosome and used a genetic algorithm (GA) to determine and produce the training subsets. Because the number of instances to generate, the clustering method, the number of clusters, the number of nearest neighbors within each cluster and the oversampling method differ between bags, there is significant diversity among the generated training subsets. Zefrehi and Altincay [39] investigated integrating both heterogeneous sampling strategies and heterogeneous classification models to enhance ensemble diversity. A significant performance improvement was observed when adopting heterogeneous classifiers instead of homogeneous ones; moreover, as the number of heterogeneous classifier types increases from 1 to 5, the performance of the ensemble model continuously increases.
According to the review above, there are two major challenges for class imbalance ensemble learning: how to further improve group diversity, and how to achieve a good tradeoff between single performance and group diversity. In this study, we address these two problems simultaneously.
In this section, we first introduce the heterogeneous class imbalance ensemble learning method, and then, on this basis, describe the proposed selective evolutionary heterogeneous ensemble learning algorithm in detail.
As noted above, developing a successful ensemble learning method requires attention to two conditions: the quality of each single base learning model and the group diversity among different base learning models [40]. Generating high-quality single classifiers is not difficult; guaranteeing that the generated base classifiers are diverse enough is. The reason is that diversity, as an abstract notion, is difficult to measure accurately, and the methods for producing and/or enhancing diversity are generally naive. Bagging adopts a bootstrap technique to produce diverse training subsets, which in theory guarantees that about 36.8% of the training instances differ between two subsets [44]. Boosting acquires diverse training subsets by disturbing the distribution or weights of the training instances [45], which is totally uncontrolled in practical applications. Random forest [46] and random subspace [47] enhance the diversity of training subsets by disturbing both the instance space and the feature space, but the diversity promotion is limited by the dimension of the feature space: if the training data is low-dimensional, these two methods cannot significantly improve the diversity of the training subsets.
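As a brief aside, the 36.8% figure follows from the standard bootstrap argument: the probability that a given instance never appears in a bootstrap sample of size n is

$$ \left(1 - \frac{1}{n}\right)^{n} \longrightarrow e^{-1} \approx 0.368 \quad (n \to \infty), $$

so each bootstrap subset omits roughly 36.8% of the original instances in expectation.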
All methods mentioned above use data-level disturbance to produce diversity. However, the classifier level can produce diversity for an ensemble, too: the diversity source can be the adoption of multiple types of classifiers. This strategy is generally called a heterogeneous ensemble. Several previous works have indicated that the diversity among heterogeneous members is significantly higher than that among homogeneous ones [48,49], and a heterogeneous ensemble always performs better than any single member [50,51]. For class imbalance learning, an ensemble algorithm combining heterogeneous sampling and heterogeneous classifiers was proposed by Zefrehi and Altincay [39] and achieved improved classification results compared with some popular class imbalance ensemble learning algorithms. Drawing on the idea of their HeteroEn algorithm [39], we propose a heterogeneous ensemble (HE) algorithm for the class imbalance learning problem.
Let Φ denote the class imbalanced training data set, and let S = {s1, s2, …, sL} and K = {k1, k2, …, kN} denote the set of L different sampling algorithms and the set of N diverse types of classifiers, respectively. We can then produce L*N diverse combinations of training subsets and classification approaches, where each combination differs from every other combination at the data level, the classifier level, or both. In HE, the most popular and simplest decision rule, majority voting, is adopted to combine the decisions of all base classifiers. The flow path of the HE algorithm is described below.
Algorithm 1: HE algorithm |
Input: A class imbalanced training set Φ, a sampling approaches set S = {s1, s2, …, sL}, a classification methods set K = {k1, k2, …, kN} |
Output: An ensemble classifier set E = {C11, C12, …, C1N, C21, C22, …, C2N, …, CL1, CL2, …, CLN} |
Procedure: |
1. Initialize E = ∅; |
2. for i = 1: L |
3. for j = 1:N |
4. generate a training subset Φij by adopting the sampling algorithm si acting on the training set Φ; |
5. train a base classifier Cij on the training subset Φij by adopting the classification method kj; |
6. put Cij into E; |
7. end |
8. end |
A more intuitive description of the flow path of the HE algorithm is given in Figure 1.
It is worth noting that although both the HeteroEn algorithm proposed in [39] and our HE algorithm adopt the same construction, i.e., combining diverse sampling approaches with heterogeneous classifiers, there is a significant difference between them: HeteroEn integrates the heterogeneous combination into Bagging and Boosting ensemble learning frameworks, while our HE algorithm produces only L*N base classifiers, equal to the number of combinations constructed at the data level and classifier level. The merits of this design are two-fold: 1) it does not generate excessively redundant base classifiers that lack mutual diversity; 2) the ensemble size is restricted to a small scale, which helps reduce time and storage consumption.
To make HE effective, we also need to consider which sampling approaches and classification algorithms should be put into S and K, respectively. Obviously, to guarantee the quality of each single base classifier, every sampling and classification algorithm in HE should be robust. Furthermore, to enhance diversity, both the sampling algorithms and the classification models should be based on different theoretical rules and inherent assumptions. According to these two criteria, we select seven sampling approaches (three undersampling algorithms: RUS [9], UFFDFR [12] and MPBU [13]; and four oversampling algorithms: ROS [9], SMOTE [11], MWMOTE [8] and GA-SMOTE [16]) and six classification models to constitute S and K, respectively. These heterogeneous methods are shown in Table 1; a code sketch of the construction follows the table. In practical applications, this combination is not fixed, and users are encouraged to add or remove any sampling approach or classification model in this list.
sampling approach | classification method | ||
s1 | RUS [9] | k1 | decision tree (CART) |
s2 | UFFDFR [12] | k2 | K-nearest neighbors (KNN) |
s3 | MPBU [13] | k3 | support vector machine (SVM) with Gaussian kernel |
s4 | ROS [9] | k4 | extreme learning machine (ELM) with sigmoid activation function |
s5 | SMOTE [11] | k5 | logistic regression (LR) |
s6 | MWMOTE [8] | k6 | naïve Bayes classifier (NB) |
s7 | GA-SMOTE [16] |
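As an illustration of Algorithm 1, the sketch below builds an HE-style pool with scikit-learn classifiers. It is a simplified stand-in, not the paper's implementation: the two toy samplers from the earlier snippet replace the seven listed approaches (SMOTE is available in the imbalanced-learn package; UFFDFR, MPBU, MWMOTE and GA-SMOTE have no standard library versions), and ELM is omitted because scikit-learn provides no ELM estimator. Hyper-parameter values match those used later in the experiments.

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Sampler set S: callables mapping (X, y) to a balanced (X, y).
SAMPLERS = [random_undersample, random_oversample]   # toy stand-ins for Table 1

# Classifier set K (ELM omitted; see the lead-in above).
CLASSIFIERS = [
    DecisionTreeClassifier(),                    # CART
    KNeighborsClassifier(n_neighbors=5),         # KNN, k = 5
    SVC(kernel="rbf", gamma=0.01, C=1.0),        # SVM with Gaussian kernel
    LogisticRegression(C=1.0, max_iter=1000),    # LR with L2 regularization
    GaussianNB(),                                # naive Bayes
]

def build_he(X, y):
    """Algorithm 1: train one base classifier per (sampler, classifier) pair."""
    ensemble = []
    for sample in SAMPLERS:              # data-level diversity
        X_s, y_s = sample(X, y)
        for clf in CLASSIFIERS:          # classifier-level diversity
            ensemble.append(clone(clf).fit(X_s, y_s))
    return ensemble

def he_predict(ensemble, X):
    """Majority voting over all base classifiers (binary 0/1 labels assumed)."""
    votes = np.stack([clf.predict(X) for clf in ensemble])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```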
In HE, although we safeguard the performance of the ensemble by adopting robust and diverse sampling approaches and classification methods, there is still great potential to further improve the quality of the ensemble. Each classification method has its own inherent assumptions, which cause it to perform well on some data but poorly on other data, a phenomenon observed in many previous works [8,14,16]. Therefore, it is necessary to remove some low-quality base classifiers to further enhance the quality of the ensemble. However, because diversity is an abstract notion, it is difficult to directly estimate the effect and potential of any single base classifier in HE, so the selection can be regarded as a sophisticated optimization problem. This is consistent with the idea of the selective ensemble [41], i.e., many could be better than all.
To solve this optimization problem, we follow the idea in [41] and conduct a selective ensemble procedure on HE using an evolutionary genetic algorithm. The proposed algorithm is called the selective evolutionary heterogeneous ensemble, SEHE for short. SEHE is an adaptive algorithm that avoids explicitly estimating the quality of each base classifier and the diversity among them.
The genetic algorithm is selected because the optimization problem in SEHE is discrete, which is particularly appropriate for this technique. Several key settings must be carefully considered in a genetic algorithm [52]: the chromosome encoding strategy, the fitness function, the population size, the crossover factor, the mutation factor and the termination condition. In SEHE, a chromosome is encoded as a binary vector of length L*N representing all base classifiers: a 1 at a position indicates that the corresponding base classifier joins the final ensemble, while a 0 excludes it. Each chromosome therefore denotes a candidate selective ensemble. The fitness function used to evaluate the quality of each candidate should be directly associated with a performance evaluation metric; here, we adopt a combination of two popular class imbalance learning metrics, the F1-measure and G-mean, as the fitness function f. In particular, the F1-measure, which originates from the F-measure, can be calculated as below:
$ \text{F-measure} = \dfrac{(1 + \beta^2) \times \text{recall} \times \text{precision}}{\beta^2 \times \text{recall} + \text{precision}} $ | (1) |
where β is a parameter weighting the relative importance of precision and recall. In general, β is set to 1, which yields the widely used F1-measure:
$ \text{F1-measure} = \dfrac{2 \times \text{recall} \times \text{precision}}{\text{recall} + \text{precision}} $ | (2) |
G-mean can be calculated as below:
$ \text{G-mean} = \sqrt{\text{TPR} \times \text{TNR}} $ | (3) |
where TPR and TNR denote the true positive rate and true negative rate, respectively. Thus, the G-mean metric can be regarded as a tradeoff between TPR and TNR. Based on the F1-measure and G-mean, the fitness function f can be calculated as follows.
$ f = \frac{1}{2} \times \text{F1-measure} + \frac{1}{2} \times \text{G-mean} $ | (4) |
The fitness function f thus represents a balanced tradeoff between the F1-measure and G-mean.
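A direct implementation of the fitness in Eq (4) might look as follows, treating the minority class (coded as 1) as the positive class; this is a minimal sketch using scikit-learn's metrics, not the paper's code.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, f1_score

def fitness(y_true, y_pred):
    """f = 0.5 * F1-measure + 0.5 * G-mean (Eq 4); minority class coded as 1."""
    f1 = f1_score(y_true, y_pred, pos_label=1)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    tpr = tp / (tp + fn) if (tp + fn) else 0.0   # true positive rate
    tnr = tn / (tn + fp) if (tn + fp) else 0.0   # true negative rate
    return 0.5 * f1 + 0.5 * np.sqrt(tpr * tnr)
```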
In addition, to balance solution quality against running time, the population size and the number of terminal iterations are empirically set to 50 and 30, respectively. Furthermore, to accelerate exploration, the crossover factor and the mutation factor are set to 1.0 and 0.1, respectively: all parents perform crossover to generate offspring, and a random 10% of the positions across all chromosomes flip their coding values.
The flow path of the proposed evolutionary algorithm is presented below.
Algorithm 2: Evolutionary algorithm |
Input: An ensemble classifier set E which includes L*N base classifiers, the population size parameter Npop, a crossover factor Nc, a mutation factor Nm, the terminal iterative times Ns |
Output: A selective ensemble set SE |
Procedure: |
1. Initialize a binary vector Vbest with length L*N randomly; |
2. Initialize a variable fbest, which reserves the current best fitness value, as 0; |
3. Initialize Npop chromosomes with the length L*N using binary coding randomly; |
4. for i = 1: Ns |
5. decode each chromosome into an ensemble according to its coding mapped on E, and estimate its quality by the fitness function; if its fitness is better than fbest, replace fbest with it and update Vbest with the corresponding chromosome; |
6. perform selection, then crossover (with factor Nc) and mutation (with factor Nm) on the population to generate the next generation; |
7. end |
8. decode Vbest on E to acquire SE. |
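A compact sketch of Algorithm 2 as a binary genetic algorithm with the stated settings (population 50, 30 iterations, crossover factor 1.0, mutation factor 0.1) is given below. The fitness callable is assumed to be supplied externally, e.g., the cross-validated score described next; fitness-proportional selection is used here, a detail the pseudocode leaves open.

```python
import numpy as np

def evolve(eval_fitness, n_genes, n_pop=50, n_iter=30, p_mut=0.1, rng=None):
    """Binary GA over classifier-inclusion masks of length L*N."""
    rng = np.random.default_rng(rng)
    pop = rng.integers(0, 2, size=(n_pop, n_genes))
    best_mask, best_fit = None, -np.inf
    for _ in range(n_iter):
        fits = np.array([eval_fitness(c) for c in pop])
        if fits.max() > best_fit:                      # track Vbest / fbest
            best_fit = fits.max()
            best_mask = pop[fits.argmax()].copy()
        # fitness-proportional selection of parents
        p = fits / fits.sum() if fits.sum() > 0 else np.full(n_pop, 1.0 / n_pop)
        parents = pop[rng.choice(n_pop, size=n_pop, p=p)]
        # one-point crossover on every pair (crossover factor 1.0)
        children = parents.copy()
        for i in range(0, n_pop - 1, 2):
            cut = int(rng.integers(1, n_genes))
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        # flip roughly 10% of all positions (mutation factor 0.1)
        flips = rng.random(children.shape) < p_mut
        pop = np.where(flips, 1 - children, children)
    return best_mask
```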
In addition, we note that the fitness function of the evolutionary algorithm is associated with both the F1-measure and G-mean metrics; if both the training procedure of HE and the fitness calculation were conducted directly on the original training set Φ, the selected combination of base classifiers might overfit. To solve this problem, we adopt an internal five-fold cross validation strategy in SEHE, in which the original training set Φ is evenly divided into five non-overlapping subsets Φ1, Φ2, Φ3, Φ4 and Φ5. First, we conduct a five-round iterative procedure in which, at each round, four subsets are merged as the internal training set and the remaining one is used as the internal validation set. Then, on each internal training set, we run the HE algorithm to produce L*N classifiers, so 5*L*N classifiers are generated in total. Next, the fitness of each chromosome in SEHE is estimated five times: for the ith estimate, the classifiers trained on the ith internal training set are evaluated on the ith internal validation set. The average of the five results is taken as the fitness of the chromosome. Finally, to avoid wasting training instances, the best combination of classifiers is retrained on the original training set Φ after the evolutionary procedure finishes. Based on this rule, the flow path of the SEHE algorithm can be described as follows.
From the flow path of the SEHE algorithm, it is not difficult to observe that SEHE is implemented by a two-level training procedure: the first level runs in a five-fold cross validation environment to search for the best combination of sampling approaches and classification methods, while the second level retrains this combination on the original training set to produce the selective ensemble. Due to the two-level training mode and the evolutionary algorithm, SEHE is inevitably time-consuming.
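One practical way to tame the first-level cost is to precompute the out-of-fold predictions of all 5*L*N classifiers once, so that every GA fitness evaluation reduces to a majority vote over cached rows. A sketch follows, assuming the build_he and fitness helpers from the earlier hypothetical snippets.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def cached_oof_votes(X, y, n_splits=5, seed=0):
    """Train HE once per internal fold; cache every base classifier's
    predictions on the held-out fold (one (L*N, fold size) block per fold)."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    votes, truths = [], []
    for tr, va in skf.split(X, y):
        ensemble = build_he(X[tr], y[tr])
        votes.append(np.stack([c.predict(X[va]) for c in ensemble]))
        truths.append(y[va])
    return votes, truths

def mask_fitness(mask, votes, truths):
    """Average Eq (4) fitness of the sub-ensemble selected by a binary mask."""
    scores = []
    for v, t in zip(votes, truths):
        sub = v[mask.astype(bool)]       # rows of the selected combinations
        if sub.size == 0:
            return 0.0                   # an empty selection is worthless
        pred = (sub.mean(axis=0) >= 0.5).astype(int)
        scores.append(fitness(t, pred))
    return float(np.mean(scores))
```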
Algorithm 3: SEHE algorithm |
Input: A class imbalance training set Φ, a sampling approaches set S = {s1, s2, …, sL}, a classification methods set K = {k1, k2, …, kN}, the population size parameter Npop, a crossover factor Nc, a mutation factor Nm, the terminal iterative times Ns |
Output: A selective ensemble set SE |
Procedure: |
1. randomly divide Φ into five equal-sized subsets Φ1, Φ2, Φ3, Φ4 and Φ5; |
2. for i = 1:5 |
3. integrate all subsets except Φi as the internal training set Ψi; |
4. call the HE algorithm to produce an ensemble classifier set Ei on Ψi; |
5. record the output of each base classifier in Ei on each instance of the internal validation set Φi; |
6. end |
7. call the evolutionary algorithm to obtain a binary vector indicating which combinations of sampling approaches and classification methods should be reserved; |
8. adopt the reserved combinations to retrain a selective ensemble set SE on Φ. |
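Putting the pieces together, Algorithm 3 might be driven as follows; X, y and X_test are placeholders, and all helpers come from the earlier hypothetical sketches.

```python
# Level 1: internal five-fold CV plus GA selection over cached predictions.
votes, truths = cached_oof_votes(X, y)
mask = evolve(lambda m: mask_fitness(m, votes, truths),
              n_genes=votes[0].shape[0])        # chromosome length L*N

# Level 2: retrain the full HE on the whole training set and keep only
# the (sampler, classifier) combinations chosen by the best chromosome.
full_ensemble = build_he(X, y)
selected = [c for c, keep in zip(full_ensemble, mask) if keep]
y_pred = he_predict(selected, X_test)
```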
We collected 42 imbalanced data sets, 30 from the Keel data repository [42] and 12 from the UCI machine learning repository [43], to verify the effectiveness and superiority of the proposed SEHE algorithm. These data sets differ in the number of instances, the number of features and the class imbalance ratio (IR). Detailed information is given in Table 2.
Dataset | #Instances | #Features | IR | Dataset | #Instances | #Features | IR |
glass1 | 214 | 9 | 1.82 | ecoli-0-6-7_vs_5 | 220 | 6 | 10.00 |
wisconsin | 683 | 9 | 1.86 | led7digit-0-2-4-5-6-7-8-9_vs_1 | 443 | 7 | 10.97 |
pima | 768 | 8 | 1.87 | ecoli-0-1_vs_5 | 240 | 6 | 11.00 |
haberman | 306 | 3 | 2.78 | shuttle-6_vs_2-3 | 230 | 9 | 22.00 |
vehicle1 | 846 | 18 | 2.90 | flare-F | 1066 | 11 | 23.79 |
new-thyroid1 | 215 | 5 | 5.14 | winequality-red-4 | 1599 | 11 | 29.17 |
yeast3 | 1484 | 8 | 8.10 | shuttle-2_vs_5 | 3316 | 9 | 66.67 |
ecoli3 | 336 | 7 | 8.60 | poker-8-9_vs_5 | 2075 | 10 | 82.00 |
vowel0 | 988 | 13 | 9.98 | poker-8_vs_6 | 1477 | 10 | 85.88 |
yeast-1_vs_7 | 459 | 7 | 14.30 | banknote | 1372 | 4 | 1.25 |
ecoli4 | 336 | 7 | 15.80 | ctgC5 | 2126 | 21 | 28.50 |
abalone9-18 | 731 | 8 | 16.40 | ctgN1vsN3 | 2126 | 21 | 3.50 |
shuttle-c2-vs-c4 | 129 | 9 | 20.90 | ctgN2 | 2126 | 21 | 6.20 |
yeast4 | 1484 | 8 | 28.10 | ctgN3 | 2126 | 21 | 11.10 |
yeast5 | 1484 | 8 | 32.73 | wilt | 4839 | 5 | 17.50 |
abalone19 | 4174 | 8 | 129.44 | mfeatmor01 | 2000 | 6 | 4.00 |
ecoli-0-3-4_vs_5 | 200 | 7 | 9.00 | mfeatmor012 | 2000 | 6 | 2.30 |
ecoli-0-6-7_vs_3-5 | 222 | 7 | 9.09 | seeds2v13 | 210 | 7 | 2.00 |
yeast-0-3-5-9_vs_7-8 | 506 | 8 | 9.12 | segment1 | 2310 | 19 | 6.00 |
yeast-0-2-5-6_vs_3-7-8-9 | 1004 | 8 | 9.14 | segment12 | 2310 | 19 | 2.50
ecoli-0-1_vs_2-3-5 | 244 | 7 | 9.17 | segment123 | 2310 | 19 | 1.30 |
To estimate whether the proposed SEHE algorithm is effective and feasible, we compare it extensively with popular and state-of-the-art class imbalance ensemble learning algorithms, including UnderBagging [24], OverBagging [37], SMOTEBagging [37], RUSBoost [33], SMOTEBoost [25], EUSBoost [28], EnSVM-OTHR [20], EasyEnsemble [30], BalanceCascade [30], GIREnUS [35], 2Obj* [31], ECO-Ensemble [29], HeteroEn [39] and our own HE algorithm. Considering that the parameter settings of any algorithm could affect its performance and hence the fairness of the comparison, we adopted the parameter settings recommended in each original study as defaults. For example, from [35] we selected GIREnUS, which was reported to perform better than its counterpart GIREnOS, and from [31] we adopted the 2Obj* version that outperforms the other two versions of that algorithm. As for the number of base classifiers, it is uniformly set to 100 for all Bagging-based algorithms and 20 for all Boosting-based ones. In addition, the base classifier types in each ensemble algorithm are the default ones used in the corresponding study. For our HE and SEHE algorithms, all embedded sampling algorithms use the default parameters recommended in their references, and the classification algorithms use the following settings: KNN adopts the default k = 5; the SVM uses an RBF kernel with σ = 0.01 and penalty factor C = 1; the ELM adopts the sigmoid activation function with L = 50 hidden nodes; and the LR uses L2 regularization with the default penalty factor C = 1. To guarantee fairness, whenever another ensemble learning algorithm adopts a sampling or classification algorithm that is also used in our proposed algorithms, it is given the same parameter settings.
The F1-measure and G-mean metrics, which are used to evaluate fitness in the proposed SEHE algorithm, are also selected as the performance evaluation metrics for comparing the various ensemble algorithms. They are among the most popular metrics for evaluating the quality of a class imbalance learning algorithm.
Finally, to compare all algorithms impartially, 10 random runs of external five-fold cross validation are conducted for each algorithm, and the average of the corresponding 50 validation results is reported as its final result.
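This protocol corresponds to repeated stratified cross validation; a minimal sketch is given below, where the fit_predict callable and the metric are placeholders.

```python
import numpy as np
from sklearn.model_selection import RepeatedStratifiedKFold

def evaluate(fit_predict, metric, X, y):
    """10 random runs of external five-fold CV; the mean of the 50 per-fold
    scores is reported as the final result."""
    rskf = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
    scores = [metric(y[te], fit_predict(X[tr], y[tr], X[te]))
              for tr, te in rskf.split(X, y)]
    return float(np.mean(scores))
```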
Tables 3 and 4 present the comparative results of the 15 ensemble learning algorithms in terms of the F1-measure and G-mean metrics, respectively. On each data set, the best result is highlighted in boldface.
Dataset | UnderBagging | OverBagging | SMOTEBagging | RUSBoost | SMOTEBoost | EUSBoost | EnSVM-OTHR | EasyEnsemble | BalanceCascade | GIREnUS | 2Obj* | ECO-Ensemble | HeteroEn | HE | SEHE
glass1 | 0.6179 | 0.6418 | 0.6566 | 0.6531 | 0.6479 | 0.6710 | 0.7022 | 0.6937 | 0.6819 | 0.7074 | 0.7196 | 0.7338 | 0.7292 | 0.7231 | 0.7479 |
wisconsin | 0.9496 | 0.9501 | 0.9532 | 0.9458 | 0.9526 | 0.9531 | 0.9368 | 0.9444 | 0.9496 | 0.9537 | 0.9486 | 0.9529 | 0.9501 | 0.9529 | 0.9685 |
pima | 0.6074 | 0.6291 | 0.6399 | 0.6518 | 0.6436 | 0.6457 | 0.6706 | 0.6619 | 0.6538 | 0.6560 | 0.6489 | 0.6917 | 0.6742 | 0.6721 | 0.6898 |
haberman | 0.4310 | 0.4036 | 0.4117 | 0.4425 | 0.4328 | 0.4679 | 0.4658 | 0.4631 | 0.4254 | 0.4738 | 0.4699 | 0.4852 | 0.4798 | 0.4881 | 0.5122 |
vehicle1 | 0.6829 | 0.6473 | 0.6428 | 0.6716 | 0.6632 | 0.6811 | 0.6793 | 0.6627 | 0.6519 | 0.6521 | 0.7044 | 0.6759 | 0.6792 | 0.6721 | 0.7022 |
new-thyroid1 | 0.9275 | 0.9310 | 0.9288 | 0.9219 | 0.9468 | 0.9390 | 0.9426 | 0.9410 | 0.9272 | 0.9587 | 0.9446 | 0.9510 | 0.9451 | 0.9439 | 0.9517 |
yeast3 | 0.6866 | 0.6922 | 0.7014 | 0.7250 | 0.7033 | 0.7315 | 0.7446 | 0.7361 | 0.7410 | 0.7398 | 0.7316 | 0.7425 | 0.7355 | 0.7398 | 0.7461 |
ecoli3 | 0.5632 | 0.5791 | 0.6318 | 0.6007 | 0.6101 | 0.6049 | 0.6251 | 0.5735 | 0.5816 | 0.5835 | 0.5961 | 0.6032 | 0.6397 | 0.6381 | 0.6441 |
vowel0 | 0.9489 | 0.9876 | 0.9941 | 0.9774 | 0.9986 | 0.9826 | 0.9978 | 0.9598 | 0.9689 | 0.9936 | 0.9972 | 1.0000 | 0.9983 | 0.9973 | 1.0000 |
yeast-1_vs_7 | 0.2816 | 0.2459 | 0.4109 | 0.3828 | 0.4350 | 0.3744 | 0.3610 | 0.2969 | 0.2871 | 0.3980 | 0.3792 | 0.4160 | 0.3967 | 0.4159 | 0.4287 |
ecoli4 | 0.6491 | 0.7387 | 0.7519 | 0.6393 | 0.7275 | 0.7080 | 0.7441 | 0.6146 | 0.7269 | 0.7484 | 0.7175 | 0.7290 | 0.7448 | 0.7641 | 0.7548 |
abalone9-18 | 0.3353 | 0.3798 | 0.4314 | 0.3722 | 0.3918 | 0.3816 | 0.4277 | 0.2934 | 0.2987 | 0.3970 | 0.4052 | 0.4329 | 0.4320 | 0.4388 | 0.4530 |
shuttle-c2-vs-c4 | 0.9964 | 0.9981 | 1.0000 | 0.9981 | 1.0000 | 0.9993 | 0.9976 | 0.9952 | 0.9971 | 0.9994 | 1.0000 | 0.9972 | 0.9991 | 1.0000 | 1.0000 |
yeast4 | 0.2798 | 0.3362 | 0.3517 | 0.3394 | 0.3707 | 0.3239 | 0.3949 | 0.2875 | 0.2987 | 0.3708 | 0.3646 | 0.3802 | 0.3569 | 0.3694 | 0.3743 |
yeast5 | 0.5967 | 0.6652 | 0.6743 | 0.5969 | 0.6837 | 0.6345 | 0.6874 | 0.5350 | 0.5989 | 0.6947 | 0.6528 | 0.6777 | 0.6931 | 0.6895 | 0.6933 |
abalone19 | 0.0372 | 0.0991 | 0.0869 | 0.0417 | 0.1132 | 0.0507 | 0.0962 | 0.0487 | 0.0396 | 0.1133 | 0.1521 | 0.1312 | 0.1479 | 0.1455 | 0.1752 |
ecoli-0-3-4_vs_5 | 0.7109 | 0.7983 | 0.8109 | 0.7329 | 0.8086 | 0.7751 | 0.8108 | 0.6954 | 0.7374 | 0.7930 | 0.8028 | 0.8039 | 0.8184 | 0.7999 | 0.8481 |
ecoli-0-6-7_vs_3-5 | 0.5819 | 0.6343 | 0.6420 | 0.5927 | 0.6588 | 0.6447 | 0.6736 | 0.5210 | 0.6001 | 0.6542 | 0.6571 | 0.6848 | 0.6352 | 0.6572 | 0.6791 |
yeast-0-3-5-9_vs_7-8 | 0.2933 | 0.3231 | 0.3490 | 0.2815 | 0.3140 | 0.3392 | 0.3431 | 0.3089 | 0.3076 | 0.3418 | 0.3396 | 0.3210 | 0.3317 | 0.3396 | 0.3991 |
yeast-0-2-5-6_vs_3-7-8-9 | 0.6956 | 0.7722 | 0.7630 | 0.6717 | 0.7448 | 0.7134 | 0.7520 | 0.6837 | 0.7474 | 0.7541 | 0.7498 | 0.7526 | 0.7279 | 0.7579 | 0.7890
ecoli-0-1_vs_2-3-5 | 0.5549 | 0.6627 | 0.6788 | 0.5610 | 0.6472 | 0.6081 | 0.6654 | 0.6344 | 0.6290 | 0.6708 | 0.6587 | 0.6842 | 0.6738 | 0.6688 | 0.7223 |
ecoli-0-6-7_vs_5 | 0.5345 | 0.6860 | 0.7438 | 0.6633 | 0.7297 | 0.7515 | 0.7337 | 0.6125 | 0.7120 | 0.6995 | 0.7498 | 0.7355 | 0.7378 | 0.7394 | 0.7511 |
led7digit-0-2-4-5-6-7-8-9_vs_1 | 0.6627 | 0.7413 | 0.7427 | 0.6580 | 0.7336 | 0.7189 | 0.7355 | 0.6528 | 0.7066 | 0.7209 | 0.7515 | 0.7320 | 0.7458 | 0.7468 | 0.7515 |
ecoli-0-1_vs_5 | 0.7440 | 0.7752 | 0.7836 | 0.7506 | 0.7878 | 0.7441 | 0.7642 | 0.6784 | 0.7043 | 0.7531 | 0.7752 | 0.7807 | 0.7735 | 0.7742 | 0.7835 |
shuttle-6_vs_2-3 | 0.8929 | 0.7304 | 0.7543 | 0.8891 | 0.7984 | 0.8894 | 0.8906 | 0.8814 | 0.8839 | 0.8539 | 0.8901 | 0.8872 | 0.8756 | 0.8795 | 0.9194 |
flare-F | 0.1909 | 0.2542 | 0.2798 | 0.2590 | 0.2711 | 0.2433 | 0.2677 | 0.2516 | 0.2702 | 0.2710 | 0.2688 | 0.2854 | 0.2860 | 0.2991 | 0.3796 |
winequality-red-4 | 0.1277 | 0.1792 | 0.2098 | 0.1744 | 0.1811 | 0.1795 | 0.1811 | 0.1342 | 0.0970 | 0.1979 | 0.1594 | 0.1945 | 0.1986 | 0.2333 | 0.2691 |
shuttle-2_vs_5 | 1.0000 | 0.9951 | 1.0000 | 0.9729 | 0.9982 | 1.0000 | 1.0000 | 0.9684 | 0.9797 | 0.9968 | 0.9981 | 1.0000 | 0.9952 | 0.9954 | 1.0000 |
poker-8-9_vs_5 | 0.0372 | 0.0661 | 0.0971 | 0.0422 | 0.1157 | 0.0679 | 0.1011 | 0.0385 | 0.0406 | 0.0920 | 0.1008 | 0.1106 | 0.1033 | 0.1176 | 0.1795 |
poker-8_vs_6 | 0.2906 | 0.6790 | 0.6929 | 0.2409 | 0.6508 | 0.4411 | 0.4076 | 0.1919 | 0.2143 | 0.5080 | 0.6945 | 0.6991 | 0.6876 | 0.6734 | 0.7394 |
banknote | 0.9828 | 0.9816 | 0.9830 | 0.9901 | 0.9749 | 0.9826 | 0.9724 | 0.9849 | 0.9751 | 0.9810 | 0.9886 | 0.9815 | 0.9843 | 0.9799 | 0.9896 |
ctgC5 | 0.2820 | 0.3144 | 0.3462 | 0.2517 | 0.3727 | 0.2342 | 0.3109 | 0.3519 | 0.3442 | 0.3031 | 0.3572 | 0.3648 | 0.3528 | 0.3591 | 0.4327 |
ctgN1vsN3 | 0.6542 | 0.7310 | 0.7479 | 0.6968 | 0.7444 | 0.7991 | 0.7580 | 0.7374 | 0.7511 | 0.7332 | 0.7546 | 0.7590 | 0.7376 | 0.7521 | 0.7986 |
ctgN2 | 0.6378 | 0.7436 | 0.7387 | 0.7115 | 0.7220 | 0.7159 | 0.7034 | 0.7519 | 0.7226 | 0.7258 | 0.7196 | 0.7364 | 0.7419 | 0.7644 | 0.7428 |
ctgN3 | 0.7476 | 0.8095 | 0.7964 | 0.7555 | 0.7858 | 0.7336 | 0.7686 | 0.7587 | 0.7462 | 0.7680 | 0.7904 | 0.7725 | 0.7531 | 0.7549 | 0.8073 |
wilt | 0.5625 | 0.6991 | 0.7415 | 0.6423 | 0.7404 | 0.6985 | 0.7840 | 0.7379 | 0.6581 | 0.8132 | 0.7310 | 0.7796 | 0.7529 | 0.7522 | 0.8461 |
mfeatmor01 | 0.9089 | 0.9735 | 0.9818 | 0.9527 | 0.9886 | 0.9532 | 0.9897 | 0.9436 | 0.9528 | 0.9789 | 0.9901 | 0.9592 | 0.9301 | 0.9459 | 0.9711 |
mfeatmor012 | 0.8546 | 0.8369 | 0.8226 | 0.8990 | 0.8467 | 0.8971 | 0.8858 | 0.8442 | 0.8325 | 0.8790 | 0.8687 | 0.8884 | 0.8628 | 0.8844 | 0.9122 |
seeds2v13 | 0.9776 | 0.9664 | 0.9685 | 0.9702 | 0.9573 | 0.9696 | 0.9668 | 0.9794 | 0.9801 | 0.9642 | 0.9715 | 0.9733 | 0.9598 | 0.9696 | 0.9798 |
segment1 | 0.8806 | 0.9428 | 0.9335 | 0.9010 | 0.9552 | 0.9336 | 0.9344 | 0.9522 | 0.9121 | 0.9258 | 0.9306 | 0.9357 | 0.9281 | 0.9194 | 0.9479 |
segment12 | 0.9562 | 0.9314 | 0.9210 | 0.9668 | 0.9506 | 0.9632 | 0.9545 | 0.9619 | 0.9454 | 0.9508 | 0.9612 | 0.9675 | 0.9449 | 0.9491 | 0.9677 |
segment123 | 0.9986 | 0.9847 | 0.9859 | 0.9987 | 0.9856 | 0.9981 | 1.0000 | 0.9992 | 0.9992 | 0.9983 | 0.9869 | 1.0000 | 0.9906 | 0.9984 | 1.0000 |
Dataset | UnderBagging | OverBagging | SMOTEBagging | RUSBoost | SMOTEBoost | EUSBoost | EnSVM-OTHR | EasyEnsemble | BalanceCascade | GIREnUS | 2Obj* | ECO-Ensemble | HeteroEn | HE | SEHE
glass1 | 0.7211 | 0.7202 | 0.7198 | 0.7454 | 0.7219 | 0.7418 | 0.7529 | 0.7622 | 0.7619 | 0.7588 | 0.7641 | 0.7832 | 0.7699 | 0.7701 | 0.8109 |
wisconsin | 0.9796 | 0.9701 | 0.9732 | 0.9769 | 0.9680 | 0.9745 | 0.9761 | 0.9687 | 0.9594 | 0.9732 | 0.9858 | 0.9761 | 0.9752 | 0.9747 | 0.9830 |
pima | 0.6733 | 0.6566 | 0.6721 | 0.6831 | 0.6702 | 0.6801 | 0.7673 | 0.7328 | 0.7285 | 0.7271 | 0.7166 | 0.7330 | 0.7575 | 0.7520 | 0.7793 |
haberman | 0.6010 | 0.5994 | 0.6004 | 0.6002 | 0.6118 | 0.6233 | 0.6219 | 0.6258 | 0.6146 | 0.6375 | 0.5998 | 0.6425 | 0.6333 | 0.6379 | 0.6412 |
vehicle1 | 0.7858 | 0.7741 | 0.7759 | 0.7901 | 0.7722 | 0.7910 | 0.7853 | 0.7766 | 0.7692 | 0.7581 | 0.7834 | 0.8022 | 0.7931 | 0.8018 | 0.8356 |
new-thyroid1 | 0.9792 | 0.9801 | 0.9847 | 0.9818 | 0.9836 | 0.9790 | 0.9829 | 0.9715 | 0.9718 | 0.9844 | 0.9901 | 0.9774 | 0.9855 | 0.9882 | 0.9899 |
yeast3 | 0.8993 | 0.8899 | 0.8706 | 0.8944 | 0.9052 | 0.8911 | 0.9237 | 0.8901 | 0.8877 | 0.9018 | 0.9134 | 0.9075 | 0.8969 | 0.8993 | 0.9105 |
ecoli3 | 0.8580 | 0.8681 | 0.8706 | 0.8478 | 0.8644 | 0.8590 | 0.8647 | 0.8760 | 0.8791 | 0.8632 | 0.8661 | 0.8744 | 0.8709 | 0.8776 | 0.8924 |
vowel0 | 0.9978 | 0.9963 | 0.9981 | 0.9972 | 0.9950 | 0.9981 | 1.0000 | 0.9935 | 0.9962 | 1.0000 | 0.9971 | 1.0000 | 0.9994 | 0.9991 | 1.0000 |
yeast-1_vs_7 | 0.5678 | 0.6882 | 0.7109 | 0.5945 | 0.7208 | 0.7044 | 0.6786 | 0.7430 | 0.7238 | 0.7196 | 0.7200 | 0.6959 | 0.7006 | 0.7159 | 0.7455 |
ecoli4 | 0.9196 | 0.8998 | 0.9106 | 0.9042 | 0.9119 | 0.9208 | 0.9198 | 0.9304 | 0.9196 | 0.9211 | 0.9307 | 0.9256 | 0.9388 | 0.9306 | 0.9329 |
abalone9-18 | 0.6495 | 0.6784 | 0.6513 | 0.6690 | 0.6755 | 0.6821 | 0.7492 | 0.7326 | 0.7419 | 0.7430 | 0.7374 | 0.7513 | 0.7333 | 0.7420 | 0.7764 |
shuttle-c2-vs-c4 | 0.9971 | 0.9984 | 1.0000 | 0.9975 | 1.0000 | 0.9995 | 0.9977 | 0.9941 | 0.9986 | 0.9997 | 1.0000 | 0.9982 | 0.9995 | 1.0000 | 1.0000 |
yeast4 | 0.8139 | 0.8057 | 0.8236 | 0.8294 | 0.8175 | 0.8277 | 0.8310 | 0.8194 | 0.8196 | 0.8079 | 0.8221 | 0.8174 | 0.8395 | 0.8358 | 0.8379 |
yeast5 | 0.9535 | 0.9457 | 0.9674 | 0.9521 | 0.9443 | 0.9576 | 0.9495 | 0.9478 | 0.9406 | 0.9528 | 0.9663 | 0.9712 | 0.9591 | 0.9544 | 0.9642 |
abalone19 | 0.3447 | 0.7240 | 0.7461 | 0.3290 | 0.6973 | 0.7276 | 0.5969 | 0.7085 | 0.4919 | 0.6738 | 0.7092 | 0.7329 | 0.7170 | 0.7181 | 0.7702 |
ecoli-0-3-4_vs_5 | 0.8781 | 0.8857 | 0.8796 | 0.8981 | 0.8848 | 0.8932 | 0.8921 | 0.8830 | 0.8829 | 0.8990 | 0.8998 | 0.8776 | 0.8939 | 0.8901 | 0.8957 |
ecoli-0-6-7_vs_3-5 | 0.8306 | 0.7998 | 0.8081 | 0.8272 | 0.8198 | 0.8255 | 0.8551 | 0.8030 | 0.8212 | 0.8187 | 0.8190 | 0.8464 | 0.8211 | 0.8186 | 0.8539 |
yeast-0-3-5-9_vs_7-8 | 0.6997 | 0.5687 | 0.5868 | 0.6739 | 0.5710 | 0.6845 | 0.6861 | 0.6932 | 0.6911 | 0.6954 | 0.6878 | 0.6759 | 0.6922 | 0.6878 | 0.7154 |
yeast-0-2-5-6_vs_3-7-8-9 | 0.8874 | 0.9028 | 0.9012 | 0.8966 | 0.8998 | 0.8775 | 0.8964 | 0.8981 | 0.8942 | 0.8911 | 0.8886 | 0.8932 | 0.8897 | 0.8933 | 0.8959
ecoli-0-1_vs_2-3-5 | 0.8377 | 0.8119 | 0.8088 | 0.8394 | 0.8299 | 0.8301 | 0.8442 | 0.8527 | 0.8504 | 0.8425 | 0.8501 | 0.8499 | 0.8455 | 0.8479 | 0.8697 |
ecoli-0-6-7_vs_5 | 0.8000 | 0.8642 | 0.8591 | 0.8320 | 0.8529 | 0.8617 | 0.8832 | 0.8653 | 0.8724 | 0.8695 | 0.8610 | 0.8736 | 0.8575 | 0.8531 | 0.8792 |
led7digit-0-2-4-5-6-7-8-9_vs_1 | 0.8534 | 0.8631 | 0.8670 | 0.8498 | 0.8692 | 0.8506 | 0.8772 | 0.8650 | 0.8762 | 0.8696 | 0.8779 | 0.8699 | 0.8858 | 0.8744 | 0.8821 |
ecoli-0-1_vs_5 | 0.8850 | 0.8952 | 0.8926 | 0.8901 | 0.8974 | 0.8833 | 0.8906 | 0.8864 | 0.8732 | 0.8830 | 0.8868 | 0.8953 | 0.8862 | 0.8934 | 0.9195 |
shuttle-6_vs_2-3 | 0.8586 | 0.7760 | 0.7793 | 0.8228 | 0.7896 | 0.8499 | 0.8934 | 0.8820 | 0.8959 | 0.8876 | 0.9030 | 0.9123 | 0.8896 | 0.8867 | 0.9095 |
flare-F | 0.8218 | 0.7477 | 0.7652 | 0.8333 | 0.7501 | 0.8198 | 0.6744 | 0.8219 | 0.7946 | 0.8079 | 0.8139 | 0.8448 | 0.7989 | 0.8016 | 0.8337 |
winequality-red-4 | 0.5919 | 0.6732 | 0.6673 | 0.5792 | 0.6545 | 0.5808 | 0.6273 | 0.5898 | 0.5170 | 0.6914 | 0.6632 | 0.6500 | 0.6705 | 0.6829 | 0.7173 |
shuttle-2_vs_5 | 1.0000 | 0.9982 | 1.0000 | 0.9711 | 0.9976 | 1.0000 | 1.0000 | 0.9774 | 0.9823 | 0.9954 | 0.9987 | 1.0000 | 0.9965 | 0.9988 | 1.0000 |
poker-8-9_vs_5 | 0.3317 | 0.6053 | 0.6149 | 0.3140 | 0.5782 | 0.3438 | 0.5562 | 0.3710 | 0.3929 | 0.4618 | 0.4491 | 0.5312 | 0.4700 | 0.4854 | 0.6157 |
poker-8_vs_6 | 0.5252 | 0.7410 | 0.7299 | 0.5740 | 0.7318 | 0.5237 | 0.7098 | 0.6536 | 0.6790 | 0.7492 | 0.7008 | 0.7129 | 0.7190 | 0.7152 | 0.7733 |
banknote | 0.9876 | 0.9810 | 0.9809 | 0.9913 | 0.9804 | 0.9821 | 0.9856 | 0.9830 | 0.9777 | 0.9822 | 0.9896 | 0.9811 | 0.9834 | 0.9791 | 0.9850 |
ctgC5 | 0.5134 | 0.6658 | 0.6233 | 0.5730 | 0.6419 | 0.5099 | 0.5858 | 0.5736 | 0.6297 | 0.6419 | 0.6322 | 0.6785 | 0.6412 | 0.6430 | 0.6859 |
ctgN1vsN3 | 0.8297 | 0.8170 | 0.8254 | 0.8137 | 0.8350 | 0.8445 | 0.8454 | 0.8361 | 0.8299 | 0.8317 | 0.8471 | 0.8360 | 0.8338 | 0.8415 | 0.8447 |
ctgN2 | 0.8052 | 0.7969 | 0.8011 | 0.7994 | 0.8138 | 0.7876 | 0.8452 | 0.8092 | 0.7956 | 0.8004 | 0.8123 | 0.8223 | 0.7964 | 0.8558 | 0.8770 |
ctgN3 | 0.8211 | 0.8732 | 0.8536 | 0.8341 | 0.8840 | 0.8109 | 0.8444 | 0.8277 | 0.8309 | 0.8650 | 0.8233 | 0.8766 | 0.8427 | 0.8501 | 0.8766 |
wilt | 0.7691 | 0.7038 | 0.6852 | 0.7222 | 0.7001 | 0.7559 | 0.7981 | 0.7226 | 0.7028 | 0.8006 | 0.7304 | 0.7578 | 0.7449 | 0.7611 | 0.7999 |
mfeatmor01 | 0.9521 | 0.9815 | 0.9874 | 0.9662 | 0.9901 | 0.9633 | 0.9884 | 0.9601 | 0.9732 | 0.9807 | 0.9874 | 0.9663 | 0.9406 | 0.9535 | 0.9715 |
mfeatmor012 | 0.8998 | 0.8227 | 0.8376 | 0.9015 | 0.8778 | 0.9192 | 0.9135 | 0.9004 | 0.8858 | 0.9030 | 0.8564 | 0.9002 | 0.8568 | 0.8774 | 0.9323 |
seeds2v13 | 0.9855 | 0.9724 | 0.9789 | 0.9802 | 0.9693 | 0.9891 | 0.9868 | 0.9780 | 0.9876 | 0.9905 | 0.9872 | 0.9901 | 0.9701 | 0.9743 | 0.9856 |
segment1 | 0.9423 | 0.9637 | 0.9716 | 0.9332 | 0.9630 | 0.9236 | 0.9579 | 0.9411 | 0.9619 | 0.9690 | 0.9848 | 0.9597 | 0.9481 | 0.9577 | 0.9700 |
segment12 | 0.9701 | 0.9539 | 0.9611 | 0.9678 | 0.9555 | 0.9690 | 0.9772 | 0.9571 | 0.9580 | 0.9747 | 0.9689 | 0.9677 | 0.9704 | 0.9708 | 0.9882 |
segment123 | 0.9991 | 0.9897 | 0.9879 | 0.9994 | 0.9834 | 0.9990 | 1.0000 | 0.9995 | 0.9994 | 0.9989 | 0.9881 | 1.0000 | 0.9935 | 0.9991 | 1.0000 |
From the results in Tables 3 and 4, it is easy to observe that our proposed SEHE algorithm outperforms all other competitors: it produces the best F1-measure on 24 data sets and the best G-mean on 22 data sets, far more than any other class imbalance ensemble learning algorithm. We also note that SEHE produces good classification results both on data sets with a low class imbalance ratio and on highly imbalanced data sets. We attribute this to two reasons: 1) the combination of heterogeneous sampling methods and heterogeneous classification models provides enough diversity for constructing the ensemble learning model, and 2) the evolutionary selection procedure in SEHE adaptively meets the demands of the data distribution. Therefore, the proposed SEHE algorithm is a robust, flexible and self-adaptive class imbalance learning algorithm.
Another interesting observation from Tables 3 and 4 is that several state-of-the-art algorithms proposed in recent years, including EnSVM-OTHR, GIREnUS, 2Obj*, ECO-Ensemble and HeteroEn, generally perform better than the previously popular algorithms. Of course, in contrast to the simple ensembles, these state-of-the-art algorithms often have more sophisticated constructions, and hence their training procedures are also more complex.
We also observe that the proposed SEHE algorithm performs clearly better than the HE algorithm on almost all data sets, which again shows the merit of adopting the selective ensemble strategy. We believe that, compared with aggregating all base classifiers, adaptively selecting some of them helps to find a better tradeoff between single performance and group diversity.
Furthermore, we are curious which combinations of sampling approaches and classification methods perform best in HE. On each data set, we scanned the average of the F1-measure and G-mean over all combinations, and the best combination was recorded based on 10 random runs of external five-fold cross validation. The top 10 combinations are presented in Table 5. From Table 5, one can conclude that combining a sophisticated sampling approach, e.g., MWMOTE, MPBU, UFFDFR or GA-SMOTE, with a robust classification method, e.g., the support vector machine or extreme learning machine, more easily produces excellent classification performance.
Rank | Combination | # data sets performing best | |
1 | MPBU + Support Vector Machine | 7 | |
2 | UFFDFR + Logistic Regression | 5 | |
3 | MWMOTE + Extreme Learning Machine | 4 | |
4 | MWMOTE + Support Vector Machine | 3 | |
5 | UFFDFR + CART | 3 |
6 | GA-SMOTE + Support Vector Machine | 2 | |
7 | MPBU + KNN | 2 | |
8 | MWMOTE + Support Vector Machine | 2 | |
9 | GA-SMOTE + Extreme Learning Machine | 2 | |
10 | MPBU + Extreme Learning Machine | 2 |
Furthermore, we investigated an important question: how many base classifiers are selected to form the ensemble in the proposed SEHE algorithm? Figure 2 shows the average number of selected base classifiers in SEHE over 50 independent runs on each data set. On most data sets, SEHE integrates no more than 20 base classifiers to make the final decision yet produces robust results, which again supports the claim that SEHE can adaptively find the tradeoff between single quality and group diversity.
Next, the Nemenyi test [53,54] is used to examine whether statistically significant differences exist between the proposed SEHE algorithm and the other algorithms. Specifically, the critical difference (CD) is used to show the differences among the algorithms. Figure 3 shows the CD diagram at the standard significance level α = 0.05, where the average ranking of each algorithm is marked along the axis (better average rankings to the left). In a CD diagram, algorithms that are not significantly different are connected by a thick line.
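For reference, the critical difference in the Nemenyi test is the standard quantity

$$ \mathrm{CD} = q_{\alpha}\sqrt{\frac{k(k+1)}{6N}}, $$

where k is the number of compared algorithms (15 here), N is the number of data sets (42 here), and q_α is the critical value of the Studentized range statistic divided by √2 [53,54].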
In Figure 3, our proposed SEHE algorithm acquires the lowest average rankings, 1.7976 on the F1-measure and 2.3571 on the G-mean, making it the best of all compared algorithms on both metrics. At the significance level α = 0.05, SEHE significantly outperforms all competitors except ECO-Ensemble on the F1-measure metric; hence, on the F1-measure at least, SEHE does not show a significant superiority over ECO-Ensemble. In addition, Figure 3 again indicates that several state-of-the-art algorithms proposed in recent years generally perform better than the previously popular algorithms.
To quantify how much improvement the proposed SEHE algorithm achieves, we also compared it with two baselines: the performance of the best base classifier in HE, and the performance of HE itself. The percentage improvements over these two baselines in terms of the F1-measure and G-mean metrics are presented in Figure 4.
Figure 4 shows that on most data sets, the proposed SEHE algorithm improves classification performance over both baselines to some extent, which again indicates the necessity of adopting a selective ensemble. Meanwhile, we observed an interesting phenomenon: on most data sets, the HE algorithm performs worse than its best member (base classifier). This is acceptable because, in a practical application, the best combination of sampling method and classification model cannot be foreseen in advance; compared with directly integrating heterogeneous methods, model selection is obviously a more complex task.
Finally, we designed two groups of ablation experiments to clarify the effects of adopting diverse sampling approaches and heterogeneous classifiers in SEHE, respectively. The ablation is performed at the data level and at the classifier level: in each test, we reserve only one method at one level, while the other level remains unchanged. For example, if the ablation concerns SMOTE, all other sampling algorithms are removed, but the classifier level still maintains the six heterogeneous classifiers. Considering that ablation reduces the ensemble size and would otherwise make the comparison unfair, we applied the bootstrap technique before sampling to guarantee that a total of 42 diverse base classifiers are generated. A pairwise t-test at the 5% significance level was adopted to compare SEHE with each of its ablated variants, and the numbers of wins/ties/losses over all 42 data sets were recorded.
Ablation comparison | F1-measure | G-mean | ||||
Win | Tie | Loss | Win | Tie | Loss | |
SEHE vs. SEHE_RUS | 15 | 26 | 1 | 14 | 25 | 3 |
SEHE vs. SEHE_UFFDFR | 13 | 28 | 1 | 14 | 28 | 0 |
SEHE vs. SEHE_MPBU | 11 | 29 | 2 | 15 | 26 | 1 |
SEHE vs. SEHE_ROS | 10 | 31 | 1 | 12 | 28 | 2 |
SEHE vs. SEHE_SMOTE | 7 | 33 | 2 | 9 | 29 | 4 |
SEHE vs. SEHE_MWMOTE | 6 | 34 | 2 | 10 | 29 | 3 |
SEHE vs. SEHE_GA-SMOTE | 6 | 32 | 4 | 8 | 33 | 1 |
Ablation comparison | F1-measure | G-mean | ||||
Win | Tie | Loss | Win | Tie | Loss | |
SEHE vs. SEHE_CART | 17 | 25 | 0 | 19 | 22 | 1 |
SEHE vs. SEHE_KNN | 20 | 21 | 1 | 23 | 19 | 0 |
SEHE vs. SEHE_SVM | 15 | 22 | 5 | 17 | 22 | 3 |
SEHE vs. SEHE_ELM | 13 | 26 | 3 | 15 | 24 | 3 |
SEHE vs. SEHE_LR | 18 | 24 | 0 | 21 | 20 | 1 |
SEHE vs. SEHE_NB | 25 | 17 | 0 | 23 | 18 | 1 |
Tables 6 and 7 provide the results of the ablation experiments at the data level and the classifier level, respectively. Each ablated variant was still evaluated by 10 random runs of external five-fold cross validation. The results show that reducing diversity at the data level influences classification performance less than reducing it at the classifier level. In other words, the adoption of heterogeneous classifiers contributes significantly more to the success of our SEHE algorithm, which again illustrates that an ensemble of heterogeneous classifiers helps reduce the bias of any single classifier and enhances the group diversity of the ensemble.
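For concreteness, the per-data-set decision rule behind Tables 6 and 7 can be sketched as below; the score arrays (one entry per validation fold) are placeholders.

```python
from scipy import stats

def win_tie_loss(scores_sehe, scores_ablation, alpha=0.05):
    """Paired t-test at the 5% level on the 50 per-fold scores of one data set:
    'win' if SEHE is significantly better, 'loss' if worse, else 'tie'."""
    t, p = stats.ttest_rel(scores_sehe, scores_ablation)
    if p >= alpha:
        return "tie"
    return "win" if t > 0 else "loss"
```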
In this study, a selective evolutionary heterogeneous ensemble algorithm has been proposed to address the class imbalance learning problem. First, a group of different sampling approaches is run on the original training data to acquire multiple diverse balanced training subsets. Then, multiple different types of classifiers are trained on each subset, generating diverse base classifiers. Next, a selective ensemble procedure relying on an evolutionary algorithm searches for the best combination of base classifiers. Extensive comparative experiments show that the proposed SEHE algorithm performs better than some state-of-the-art class imbalance ensemble learning algorithms.
From the experimental results, several major conclusions could be safely drawn as follows:
1) Disturbing the data level and the classifier level both help to enhance ensemble diversity, but the disturbance at the classifier level contributes significantly more.
2) Many are really better than all: Adaptively selecting a few base classifiers always performs better than aggregating all.
3) The best base classifier often outperforms HE, but in most cases it performs worse than SEHE.
4) SEHE is a flexible, robust and competitive algorithm to solve the class imbalance learning problem in real-world applications.
In future work, we will investigate how to decrease the time complexity of the SEHE algorithm. We will also observe how the performance of SEHE varies when the library of sampling and classification methods is extended. Additionally, although SEHE has not been verified on multi-class imbalanced learning problems in this study, it is compatible with both binary and multi-class classification problems if and only if all embedded sampling algorithms are appropriate for both settings. Therefore, extending the current SEHE version to multi-class classification problems and estimating its effectiveness and superiority will also be studied in future work.
This work was supported in part by the National Natural Science Foundation of China under Grant No. 62076215.
The authors declare there is no conflict of interest.