
Best Model Selection

Performance Analysis

After training three gradient boosting models with hardcoded hyperparameters, a comparative performance analysis shows closely matched results across the implementations. The models achieved AUC scores ranging from 0.960 to 0.966, indicating strong predictive capability.
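
The following is a minimal sketch of this comparison step. Only XGBoost is named in the text, so the other two gradient boosting implementations (LightGBM and scikit-learn's GradientBoostingClassifier), the synthetic stand-in data, and the specific hyperparameter values are assumptions for illustration only.

```python
# Sketch: train three gradient boosting variants with fixed (hardcoded)
# hyperparameters and compare their validation AUC scores.
# Synthetic data stands in for the bank client subscription features.
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.88], random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Hardcoded hyperparameters (illustrative values, not the project's actual settings).
models = {
    "xgboost": XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1),
    "lightgbm": LGBMClassifier(n_estimators=300, num_leaves=31, learning_rate=0.1),
    "sklearn_gbm": GradientBoostingClassifier(n_estimators=300, max_depth=3),
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    val_proba = model.predict_proba(X_val)[:, 1]
    scores[name] = roc_auc_score(y_val, val_proba)

# Rank the candidates by validation AUC, highest first.
for name, auc in sorted(scores.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: validation AUC = {auc:.3f}")
```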

(Figure: model comparison graph of AUC scores across the three models)

Model Comparison Results

The evaluation shows similar performance across all three model types, with only marginal differences in AUC scores:

(Figure: model comparison list with the per-model AUC scores)
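
The run ID and auto-generated run name shown below suggest the experiments are tracked with MLflow; under that assumption, the comparison above can be reproduced by querying the tracking server. The experiment name "bank-subscription" and the metric key "val_auc" are placeholders, not confirmed by this page.

```python
# Sketch: pull all logged runs and compare their validation AUC scores.
import mlflow

runs = mlflow.search_runs(
    experiment_names=["bank-subscription"],   # assumed experiment name
    order_by=["metrics.val_auc DESC"],        # assumed metric key
)

# Show the run ID, generated run name, and AUC for each candidate model.
print(runs[["run_id", "tags.mlflow.runName", "metrics.val_auc"]])
```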

Champion Model Selection

Based on the initial evaluation results, the best-performing model was identified as:

Run ID: f2c64e293c9246fa904dd6f66bce8c9f (treasured-mole-567)
Model Type: XGBoost
AUC Score: 0.966

This model represents the current champion configuration for the bank client subscription prediction task.
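Assuming MLflow tracking, the champion model can be reloaded directly by its run ID. The artifact path "model" is an assumption; adjust it to match however the project logs its models.

```python
# Sketch: load the champion model by its MLflow run ID for scoring.
import mlflow

champion_run_id = "f2c64e293c9246fa904dd6f66bce8c9f"  # treasured-mole-567
champion_model = mlflow.pyfunc.load_model(f"runs:/{champion_run_id}/model")

# Score new clients with the champion model (X_new is a prepared feature frame):
# predictions = champion_model.predict(X_new)
```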

Best Model Performance

Detailed metrics and parameters for the selected champion model:

(Figure: detailed performance metrics and parameters of the champion model)
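
A minimal sketch for retrieving those logged metrics and hyperparameters programmatically, again assuming the run is tracked with MLflow and the tracking URI is configured:

```python
# Sketch: fetch the champion run's logged metrics and parameters for reporting.
from mlflow.tracking import MlflowClient

client = MlflowClient()
run = client.get_run("f2c64e293c9246fa904dd6f66bce8c9f")

print("Metrics:")
for name, value in run.data.metrics.items():
    print(f"  {name}: {value}")

print("Parameters:")
for name, value in run.data.params.items():
    print(f"  {name}: {value}")
```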

While all models perform strongly within a narrow AUC range (0.960 to 0.966), the XGBoost model achieved the highest validation AUC and was therefore selected as the champion.