📘 Blog: Understanding Mean Absolute Error (MAE)
🔹 Question Recap

```python
from sklearn.metrics import mean_absolute_error

y_true = [2, 0, 3, 5]
y_pred = [2.5, 0.0, 2, 8]
mean_absolute_error(y_true, y_pred)
```
Options:

- 1.25
- 2.56
- 4.25
- ✅ 1.12
- Error
✅ Step 1: What is MAE?

Mean Absolute Error (MAE) is a regression evaluation metric.

Formula:

MAE = (1/n) × Σ |y_true − y_pred|

- It calculates the absolute difference between each actual and predicted value.
- Then it takes the average of these differences.
- Smaller MAE → better performance.
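The formula above can be written directly as a small function; a minimal pure-Python sketch (the function name is mine, chosen for illustration):

```python
def mean_absolute_error_manual(y_true, y_pred):
    """MAE = average of |actual - predicted| over all samples."""
    errors = [abs(t - p) for t, p in zip(y_true, y_pred)]  # absolute differences
    return sum(errors) / len(errors)                       # average them

print(mean_absolute_error_manual([2, 0, 3, 5], [2.5, 0.0, 2, 8]))  # 1.125
```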
✅ Step 2: Apply the Formula

| Index | y_true | y_pred | Error = \|y_true − y_pred\| |
|-------|--------|--------|------------------------------|
| 0     | 2      | 2.5    | 0.5                          |
| 1     | 0      | 0.0    | 0.0                          |
| 2     | 3      | 2      | 1.0                          |
| 3     | 5      | 8      | 3.0                          |

Now sum the errors: 0.5 + 0.0 + 1.0 + 3.0 = 4.5

Divide by the number of samples (4): 4.5 / 4 = 1.125

Answer ≈ 1.12 ✅
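To confirm the arithmetic, the snippet from the question can be run as-is (assuming scikit-learn is installed):

```python
from sklearn.metrics import mean_absolute_error

y_true = [2, 0, 3, 5]
y_pred = [2.5, 0.0, 2, 8]

# (0.5 + 0.0 + 1.0 + 3.0) / 4 = 1.125, which rounds to 1.12
result = mean_absolute_error(y_true, y_pred)
print(result)  # 1.125
```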
✅ Step 3: Why Absolute Error?

- If we used raw (signed) differences, positive and negative errors would cancel out.
- Example: if one prediction is too high by +5 and another too low by −5, the average error would look like zero, which is misleading.
- Taking the absolute value ensures every error counts positively.
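The cancellation effect is easy to demonstrate; a small sketch with made-up numbers:

```python
y_true = [10, 10]
y_pred = [15, 5]   # one prediction +5 too high, one -5 too low

# Raw (signed) mean error: the +5 and -5 cancel out
signed_mean = sum(p - t for t, p in zip(y_true, y_pred)) / len(y_true)
print(signed_mean)  # 0.0 -- looks perfect, yet both predictions miss by 5

# Mean absolute error reveals the true average miss
mae = sum(abs(p - t) for t, p in zip(y_true, y_pred)) / len(y_true)
print(mae)  # 5.0
```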
✅ Step 4: What Students Should Also Know

Other error metrics (MCQ variations often test these):

- MSE (Mean Squared Error): squares the errors → penalizes larger mistakes more.
- RMSE (Root Mean Squared Error): square root of MSE → interpretable in the original scale.
- R² (Coefficient of Determination): measures goodness of fit (closer to 1 is better).
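For comparison, the same y_true/y_pred can be scored with each of these metrics; a minimal sketch, again assuming scikit-learn:

```python
from math import sqrt
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = [2, 0, 3, 5]
y_pred = [2.5, 0.0, 2, 8]

mae = mean_absolute_error(y_true, y_pred)   # average absolute miss
mse = mean_squared_error(y_true, y_pred)    # squares each error before averaging
rmse = sqrt(mse)                            # back to the target's original units
r2 = r2_score(y_true, y_pred)               # 1.0 would be a perfect fit

print(mae, mse, rmse, r2)
```

Note that the MSE here works out to 2.5625 ≈ 2.56 — the second option in the MCQ above, a classic distractor for students who square the errors by mistake.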
When to prefer which:

- MAE: good when you want robustness to outliers.
- MSE / RMSE: good when large errors are especially costly (e.g., predicting house prices).

Scaling & interpretation:

- MAE is in the same units as the target variable → intuitive for real-world problems.
- Example: if predicting delivery times in minutes, MAE = 3 means the model is off by 3 minutes on average.
A useful classroom exercise:

- Have students manually compute the error for a small dataset.
- Then run `mean_absolute_error()` in sklearn.
- They see that the hand math and the library output are consistent.
✅ General Tip for Similar Questions

Whenever you see:

```python
from sklearn.metrics import ...
```

👉 Immediately think:

- Which metric is used? (MAE, MSE, RMSE, Accuracy, Precision, Recall, F1)
- What is the formula for that metric?
- Apply it step by step manually before trusting the options.
🎯 Key Takeaway

- MAE = 1.12 (more precisely, 1.125) in this case.
- Always remember: MAE → "average absolute error per prediction".
- Understand when to prefer MAE vs MSE in real-world projects.