Confusion Matrix
Interpretation Aid
The confusion matrix shows how well the model classifies observations into the two outcome categories. The rows represent the actual outcomes (0 or 1) and the columns represent the predicted outcomes. Diagonal cells (true negatives and true positives) indicate correct predictions; off-diagonal cells (false positives and false negatives) show errors. Adjust the classification threshold to trade off sensitivity (correctly identifying 1s) against specificity (correctly identifying 0s) according to your business priorities, as sketched below.
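As a minimal sketch of how thresholding produces the matrix, the following uses scikit-learn and assumes the model outputs predicted probabilities; the example data and variable names here are hypothetical, not taken from the tool itself.

import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical data: true labels and the model's predicted probabilities
y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.4, 0.8, 0.6, 0.3, 0.9, 0.45, 0.2, 0.7, 0.55])

threshold = 0.5  # adjustable classification threshold
y_pred = (y_prob >= threshold).astype(int)

# Rows = actual outcomes (0, 1); columns = predicted outcomes (0, 1)
cm = confusion_matrix(y_true, y_pred, labels=[0, 1])
tn, fp, fn, tp = cm.ravel()
print(cm)
print(f"TN={tn}, FP={fp}, FN={fn}, TP={tp}")

Raising the threshold generally shifts predictions toward 0, increasing specificity at the cost of sensitivity; lowering it does the reverse.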
Classification Metrics
Accuracy: overall proportion of correct predictions, (TP + TN) / (TP + TN + FP + FN).
Sensitivity (Recall): proportion of actual 1s correctly identified, TP / (TP + FN).
Specificity: proportion of actual 0s correctly identified, TN / (TN + FP).
Precision (PPV): proportion of predicted 1s that are actually 1, TP / (TP + FP).
F1 Score: harmonic mean of precision and recall, balancing the two, 2 x (Precision x Recall) / (Precision + Recall).
NPV: proportion of predicted 0s that are actually 0, TN / (TN + FN).
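To make these definitions concrete, here is a short self-contained sketch that computes each metric directly from the four confusion-matrix counts; the counts used below are hypothetical placeholders.

# Hypothetical confusion-matrix counts: true negatives, false positives,
# false negatives, true positives
tn, fp, fn, tp = 50, 10, 5, 35

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # overall % correct
sensitivity = tp / (tp + fn)                   # recall: % of actual 1s caught
specificity = tn / (tn + fp)                   # % of actual 0s caught
precision   = tp / (tp + fp)                   # PPV: % of predicted 1s that are 1
npv         = tn / (tn + fn)                   # % of predicted 0s that are 0
f1 = 2 * precision * sensitivity / (precision + sensitivity)

for name, value in [("Accuracy", accuracy), ("Sensitivity", sensitivity),
                    ("Specificity", specificity), ("Precision", precision),
                    ("NPV", npv), ("F1", f1)]:
    print(f"{name}: {value:.3f}")

Note that accuracy can be misleading when the classes are imbalanced, which is why sensitivity, specificity, precision, and F1 are reported alongside it.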