Calculating Precision from a Confusion Matrix

1. The Confusion Matrix

True \ Pred |  0   1   2
------------+------------
     0      |  3   2   1
     1      |  3   2   1
     2      |  2   4   2

So, for example, the value 2 at row 1, column 1 means that 2 samples actually belonged to class 1 and the model correctly predicted them as class 1 (rows index the actual class, columns the predicted class).
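If you want to verify these counts in code, the matrix can be held as a plain NumPy array. This is a minimal sketch, assuming rows index the actual class and columns the predicted class, exactly as in the table:

```python
import numpy as np

# Rows = actual class, columns = predicted class (same layout as the table)
cm = np.array([
    [3, 2, 1],  # actual class 0
    [3, 2, 1],  # actual class 1
    [2, 4, 2],  # actual class 2
])

# Diagonal entries count correct predictions; cm[1, 1] is the example above
print(cm[1, 1])  # -> 2
```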


2. Precision Formula

For a given class (say class 1):

\text{Precision} = \frac{\text{True Positives (TP)}}{\text{True Positives (TP)} + \text{False Positives (FP)}}
  • TP (True Positive for class 1): Predicted = 1 AND Actual = 1

  • FP (False Positive for class 1): Predicted = 1 BUT Actual ≠ 1
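In code, this formula maps directly onto the matrix: TP is the diagonal entry for the class, and FP is the rest of that class's column. Here is a minimal sketch (the helper name precision_for_class is just illustrative, not a library function):

```python
import numpy as np

def precision_for_class(cm: np.ndarray, k: int) -> float:
    """Precision for class k: diagonal entry over the column-k sum."""
    tp = cm[k, k]             # predicted k AND actually k
    fp = cm[:, k].sum() - tp  # predicted k BUT actually another class
    return tp / (tp + fp)
```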


3. Find TP and FP for Class 1

Look at column 1 (Predicted = 1):

  • Row 0, Col 1 = 2 → (Predicted 1, but actually 0) → False Positive

  • Row 1, Col 1 = 2 → (Predicted 1, and actually 1) → True Positive

  • Row 2, Col 1 = 4 → (Predicted 1, but actually 2) → False Positive

So:

  • TP = 2

  • FP = 2 + 4 = 6
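The same column walk in code, reusing the array from step 1 (again just a sketch):

```python
import numpy as np

cm = np.array([[3, 2, 1],
               [3, 2, 1],
               [2, 4, 2]])

col = cm[:, 1]       # column 1: everything the model predicted as class 1
tp = col[1]          # row 1 -> predicted 1 AND actually 1
fp = col.sum() - tp  # rows 0 and 2 -> 2 + 4
print(tp, fp)        # -> 2 6
```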


4. Calculate Precision

\text{Precision for class 1} = \frac{TP}{TP + FP} = \frac{2}{2 + 6} = \frac{2}{8} = 0.25

Answer = 0.25 (25%)
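As a cross-check, scikit-learn's precision_score returns the same value once the matrix is expanded back into (actual, predicted) label pairs; this sketch rebuilds those pairs from the cell counts:

```python
import numpy as np
from sklearn.metrics import precision_score

cm = np.array([[3, 2, 1],
               [3, 2, 1],
               [2, 4, 2]])

# Cell (i, j) contributes cm[i, j] samples with actual=i, predicted=j
pairs = [(i, j) for i in range(3) for j in range(3) for _ in range(cm[i, j])]
y_true, y_pred = zip(*pairs)

# Per-class precision for classes 0, 1, 2
print(precision_score(y_true, y_pred, average=None))  # -> [0.375 0.25 0.5]
```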


