1. The Confusion Matrix
| Actual \ Predicted | 0 | 1 | 2 |
|---|---|---|---|
| 0 | 3 | 2 | 1 |
| 1 | 3 | 2 | 1 |
| 2 | 2 | 4 | 2 |
- Rows = actual (true) labels
- Columns = predicted labels
So, for example, the 2 at row 1, column 1 means that 2 samples actually belonged to class 1 and the model correctly predicted them as class 1.
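To make that indexing concrete, here is a minimal sketch (assuming NumPy; the variable name `cm` is just illustrative) that stores the matrix above and reads the row-1, column-1 entry:

```python
import numpy as np

# Confusion matrix from the table above: rows = actual class, columns = predicted class.
cm = np.array([
    [3, 2, 1],  # actual class 0
    [3, 2, 1],  # actual class 1
    [2, 4, 2],  # actual class 2
])

# cm[1, 1]: samples that are actually class 1 AND were predicted as class 1.
print(cm[1, 1])  # 2
```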
2. Precision Formula
For a given class (say class 1):
- TP (True Positive for class 1): Predicted = 1 AND Actual = 1
- FP (False Positive for class 1): Predicted = 1 BUT Actual ≠ 1

Precision = TP / (TP + FP)
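As a sketch of how that formula maps onto a confusion matrix (reusing the `cm` array from the snippet above; the helper name `precision_for_class` is just illustrative): TP sits on the diagonal, and FP is the rest of the predicted column.

```python
def precision_for_class(cm, k):
    """Precision for class k, computed directly from a confusion matrix.

    TP = cm[k, k]                  -> predicted k and actually k
    FP = cm[:, k].sum() - cm[k, k] -> predicted k but actually another class
    """
    tp = cm[k, k]
    fp = cm[:, k].sum() - tp
    return tp / (tp + fp)
```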
3. Find TP and FP for Class 1
Look at column 1 (Predicted = 1):
- Row 0, Col 1 = 2 → (Predicted 1, but actually 0) → False Positive
- Row 1, Col 1 = 2 → (Predicted 1, and actually 1) → True Positive
- Row 2, Col 1 = 4 → (Predicted 1, but actually 2) → False Positive
So:
- TP = 2
- FP = 2 + 4 = 6
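A quick check of those two counts in code (continuing with the `cm` array from the first sketch): TP is the diagonal entry of column 1, and FP is everything else in that column.

```python
col_1 = cm[:, 1]       # everything predicted as class 1, split by actual class: [2, 2, 4]
tp = col_1[1]          # actually class 1 -> 2
fp = col_1.sum() - tp  # actually class 0 or 2 -> 2 + 4 = 6
print(tp, fp)          # 2 6
```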
4. Calculate Precision
Precision = TP / (TP + FP) = 2 / (2 + 6) = 2/8 = 0.25
✅ Answer = 0.25 (25%)
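To cross-check the 0.25 with scikit-learn (assuming scikit-learn is installed; the label arrays are reconstructed from the counts in the matrix, which is sufficient because precision depends only on those counts):

```python
import numpy as np
from sklearn.metrics import precision_score

cm = np.array([[3, 2, 1],
               [3, 2, 1],
               [2, 4, 2]])

# Rebuild y_true / y_pred lists consistent with the confusion matrix.
y_true, y_pred = [], []
for actual in range(3):
    for predicted in range(3):
        n = int(cm[actual, predicted])
        y_true += [actual] * n
        y_pred += [predicted] * n

# Precision for class 1 only.
print(precision_score(y_true, y_pred, labels=[1], average=None))  # [0.25]
```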