Figure 4.
The relationship between the number of features and model performance. All models were trained on the SMOTE-resampled data set, with features added incrementally according to the RFE algorithm. Each model achieves a stable, high recall and a stable, low FPR once the number of features is large enough. The smallest number of features that yields stable performance is 4 for Adaboost and Gboost, 5 for DT, and 6 for RF and XGboost.
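The curve described above can be reproduced with a sketch like the following. It is a minimal illustration, not the authors' code: the SMOTE oversampling step is omitted, `make_classification` stands in for the paper's data set, and a single `DecisionTreeClassifier` stands in for the five models; the loop sweeps the RFE feature count and records recall and FPR at each size.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.metrics import confusion_matrix, recall_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the (SMOTE-resampled) training data.
X, y = make_classification(n_samples=600, n_features=10,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

results = []
for k in range(1, X.shape[1] + 1):
    # RFE recursively eliminates features until k remain,
    # then refits the wrapped estimator on that subset.
    selector = RFE(DecisionTreeClassifier(random_state=0),
                   n_features_to_select=k)
    selector.fit(X_tr, y_tr)
    y_pred = selector.predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
    results.append((k, recall_score(y_te, y_pred), fp / (fp + tn)))

for k, rec, fpr in results:
    print(f"{k:2d} features: recall={rec:.3f}, FPR={fpr:.3f}")
```

Plotting recall and FPR against `k` from `results` gives a curve of the same shape as Figure 4; the optimal feature count is the smallest `k` after which both metrics plateau.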