Naïve Bayes: Which Statement Is Incorrect?


Let’s analyze each statement:

  1. “Naïve Bayes assumes that features are conditionally independent given the class label.”
    ✅ Correct — this is the fundamental assumption of Naïve Bayes.

  2. “Gaussian Naïve Bayes is suitable for datasets with continuous features.”
    ✅ Correct — Gaussian NB models each continuous feature with a normal distribution per class.

  3. “Multinomial Naïve Bayes is commonly used for text classification tasks.”
    ✅ Correct — it works well with word counts / term frequencies in NLP.

  4. “Naïve Bayes always outperforms logistic regression on all datasets.”
    ❌ Incorrect — Performance depends on the dataset; logistic regression often outperforms NB when features are correlated.
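The assumption in statement 1 can be written out explicitly. For features x₁, …, xₙ and class y, Naïve Bayes factorizes the class-conditional likelihood as

P(x₁, …, xₙ | y) = ∏ᵢ P(xᵢ | y),

so the posterior used for prediction is simply P(y | x) ∝ P(y) · ∏ᵢ P(xᵢ | y). Each factor P(xᵢ | y) is estimated independently, which is what makes the model “naïve” — and fast to train.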


Answer: The incorrect statement is the last one:
“Naïve Bayes always outperforms logistic regression on all datasets.”
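To make statements 2 and 3 concrete, here is a minimal sketch using scikit-learn (assumed installed): Gaussian NB on the continuous iris measurements, and Multinomial NB on word counts from a tiny made-up spam/ham corpus (the documents and labels are fabricated purely for illustration).

```python
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB, MultinomialNB
from sklearn.feature_extraction.text import CountVectorizer

# Statement 2: Gaussian NB on continuous features (iris measurements in cm)
X, y = load_iris(return_X_y=True)
gnb = GaussianNB().fit(X, y)
print("GaussianNB training accuracy:", gnb.score(X, y))

# Statement 3: Multinomial NB on word counts (toy, fabricated corpus)
docs = ["free money now", "win a prize now", "meeting at noon", "project meeting agenda"]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = ham
vec = CountVectorizer()
counts = vec.fit_transform(docs)       # document-term count matrix
mnb = MultinomialNB().fit(counts, labels)
pred = mnb.predict(vec.transform(["free prize now"]))
print("MultinomialNB prediction for 'free prize now':", int(pred[0]))
```

Note that the same fitted `CountVectorizer` must be reused at prediction time so the new document is mapped onto the training vocabulary.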
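Statement 4 is the catch, and correlated features are a classic way to see why. The quick synthetic experiment below (a sketch, with scikit-learn and NumPy assumed; the data is generated for illustration, not a benchmark) duplicates one feature many times. Naïve Bayes double-counts the repeated evidence because it treats every copy as independent, while logistic regression can spread one weight across the copies.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
X_base = rng.normal(size=(n, 3))
# Label depends on the sum of all three features plus a little noise
y = (X_base.sum(axis=1) + rng.normal(scale=0.5, size=n) > 0).astype(int)
# Duplicate the first feature ten extra times -> highly correlated inputs
X = np.hstack([X_base] + [X_base[:, :1]] * 10)

nb_score = cross_val_score(GaussianNB(), X, y, cv=5).mean()
lr_score = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"GaussianNB : {nb_score:.3f}")
print(f"LogisticReg: {lr_score:.3f}")
```

On data like this, logistic regression typically scores noticeably higher, because NB's independence assumption effectively gives the duplicated feature eleven votes. On datasets where features really are close to conditionally independent, or when training data is scarce, NB can match or beat logistic regression — which is exactly why "always outperforms" is false in either direction.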
