### What are the Pros/Cons of Naive Bayes?

Pros: Computational efficiency

Cons: Independence assumption is not realistic for many data sets

Gaussian Naive Bayes handles continuous features by evaluating a Gaussian (normal) conditional likelihood for each feature within each class, rather than a conditional probability computed from discrete counts
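As a sketch of that idea, here is a minimal Gaussian likelihood for a single continuous feature; the per-class means and variances are made-up values standing in for statistics estimated from training data:

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of x under a normal distribution N(mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical per-class (mean, variance) for one continuous feature.
class_stats = {"A": (5.0, 0.25), "B": (6.5, 0.49)}

x = 5.2  # feature value of a new observation
scores = {c: gaussian_pdf(x, m, v) for c, (m, v) in class_stats.items()}
predicted = max(scores, key=scores.get)  # class with the larger likelihood
```

In a full classifier, each class's likelihood would be multiplied across all features and by the class prior before comparing.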

If a feature value appears zero times within a particular class in the training data, the computed likelihood score for any observation containing that value will be zero for that class
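The standard remedy is additive (Laplace) smoothing. A toy sketch with a hypothetical two-class word-count table shows the zero-likelihood problem and how smoothing avoids it:

```python
# Hypothetical word counts per class from a tiny training corpus.
counts = {"spam": {"offer": 3, "free": 2}, "ham": {"meeting": 4, "free": 1}}
vocab = {w for c in counts.values() for w in c}

def likelihood(word, cls, alpha):
    """P(word | cls) with additive (Laplace) smoothing; alpha=0 disables it."""
    total = sum(counts[cls].values())
    return (counts[cls].get(word, 0) + alpha) / (total + alpha * len(vocab))

# "offer" never appears in ham, so without smoothing its likelihood is zero,
# which zeroes out the whole product for any message containing it.
unsmoothed = likelihood("offer", "ham", alpha=0)
smoothed = likelihood("offer", "ham", alpha=1)
```

With `alpha=1`, every word gets a small nonzero likelihood in every class, so a single unseen word no longer forces the class score to zero.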

Naive Bayes combines the framework of Bayes' Theorem with the assumption that all predictors are conditionally independent of one another given the class label
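In symbols, for a feature vector (x_1, ..., x_p) and class y, the conditional-independence assumption reduces the posterior to a product of one-dimensional terms:

```latex
P(y \mid x_1, \dots, x_p) \;\propto\; P(y)\,\prod_{j=1}^{p} P(x_j \mid y)
```

The predicted label is the class that maximizes the right-hand side; the proportionality hides the evidence term, which is the same for every class.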

In some classification contexts, it may be of more interest to obtain predicted probabilities of class membership rather than simply the labels themselves.
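Naive Bayes yields such probabilities by normalizing the per-class scores. A minimal sketch, using made-up unnormalized scores (prior times likelihood) for one observation:

```python
# Hypothetical unnormalised class scores, P(y) * P(x | y), for one observation.
joint = {"spam": 0.030, "ham": 0.010}

# Dividing by the evidence (the sum over classes) gives class probabilities.
evidence = sum(joint.values())
probs = {c: s / evidence for c, s in joint.items()}
label = max(probs, key=probs.get)
```

The probabilities sum to one, so they can be thresholded or ranked rather than only used to pick the single most likely label.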

Logistic regression is the most traditional classification algorithm and preserves many of the interpretive advantages that linear regression offers for a continuous outcome.
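One of those interpretive advantages is that each coefficient has a direct meaning on the odds scale. A sketch with hypothetical fitted coefficients (an intercept and one predictor's weight, not estimated from any real data):

```python
import math

# Hypothetical fitted coefficients: intercept b0 and slope b1 for one predictor.
b0, b1 = -1.5, 0.8

def predict_proba(x):
    """Logistic model: probability of the positive class given x."""
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

# exp(b1) is an odds ratio: a one-unit increase in x multiplies the
# odds of the positive class by this factor, regardless of the value of x.
odds_ratio = math.exp(b1)
```

This constant-odds-ratio interpretation mirrors the constant-slope interpretation of a linear regression coefficient.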
