Figure 4.
The relationship between the number of features in the models and their performance. All models were trained on the SMOTE-balanced data set, with features added incrementally according to the RFE algorithm. Each model achieves a stable, high recall and a stable, low FPR once the number of features is large enough. For stability, the optimal number of features is 4 for AdaBoost and GBoost, 5 for DT, and 6 for RF and XGBoost.
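The procedure behind this figure can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes scikit-learn and imbalanced-learn, and the synthetic data set, train/test split, and RandomForest estimator are placeholders. SMOTE balances the training set, RFE selects an increasing number of features, and recall and FPR are recorded at each feature count.

```python
# Hypothetical sketch: SMOTE-balance the training data, grow the feature set via
# RFE, and record recall and FPR at each feature count (as plotted in Figure 4).
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.metrics import confusion_matrix, recall_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced data standing in for the study's data set (an assumption).
X, y = make_classification(n_samples=2000, n_features=12, weights=[0.9, 0.1],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Oversample the minority class in the training set only.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)

for n_features in range(1, X.shape[1] + 1):
    model = RandomForestClassifier(random_state=0)
    rfe = RFE(estimator=model, n_features_to_select=n_features)
    rfe.fit(X_res, y_res)

    # Refit the model on the RFE-selected feature subset and evaluate it.
    model.fit(X_res[:, rfe.support_], y_res)
    y_pred = model.predict(X_test[:, rfe.support_])

    tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
    recall = recall_score(y_test, y_pred)
    fpr = fp / (fp + tn)
    print(f"{n_features} features: recall={recall:.3f}, FPR={fpr:.3f}")
```

Running this loop for each classifier in the study (DT, RF, AdaBoost, GBoost, XGBoost) would reproduce curves of the same form as those in the figure, from which the smallest feature count with stable recall and FPR can be read off.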
