Reputation: 279
I am working on a document that should cover the key differences between using Naive Bayes (generative) and Logistic Regression (discriminative) models for text classification.
During my research, I ran into this definition of the Naive Bayes model: https://nlp.stanford.edu/IR-book/html/htmledition/naive-bayes-text-classification-1.html
The probability of a document d being in class c is computed as ... where P(t_k|c) is the conditional probability of term t_k occurring in a document of class c ...
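To check my reading of that formula, here is a minimal sketch in plain Python with made-up class priors and term probabilities (the class names "spam"/"ham" and all numbers are purely hypothetical, just to illustrate P(c|d) ∝ P(c) * prod_k P(t_k|c)):

```python
import math

# Made-up class priors P(c) and per-class term probabilities P(t|c),
# purely to illustrate the scoring formula from the IR book.
prior = {"spam": 0.4, "ham": 0.6}
cond_prob = {
    "spam": {"free": 0.20, "money": 0.15, "meeting": 0.01},
    "ham":  {"free": 0.02, "money": 0.03, "meeting": 0.10},
}

def nb_score(tokens, c):
    # Work in log space so long documents don't underflow.
    score = math.log(prior[c])
    for t in tokens:
        score += math.log(cond_prob[c][t])
    return score

doc = ["free", "money", "money"]
best = max(prior, key=lambda c: nb_score(doc, c))
print(best)  # the class with the highest P(c) * prod_k P(t_k|c)
```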
When I got to the part comparing generative and discriminative models, I found this accepted answer on StackOverflow: What is the difference between a Generative and Discriminative Algorithm?
A generative model learns the joint probability distribution p(x,y) and a discriminative model learns the conditional probability distribution p(y|x) - which you should read as "the probability of y given x".
At this point I got confused: Naive Bayes is a generative model, yet its definition is written in terms of conditional probabilities such as P(t_k|c), while it is the discriminative models that are described as learning conditional probabilities, as opposed to the joint probabilities learned by generative models.
Can someone shed some light on this please?
Thank you!
Upvotes: 9
Views: 6924
Reputation: 4778
It is generative in the sense that you don't directly model the posterior p(y|x); rather, you learn a model of the joint probability p(x,y), which can also be expressed as p(x|y) * p(y) (likelihood times prior), and then, via Bayes' rule, you find the most probable y.
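If it helps to see the contrast in code, here is a rough sketch using scikit-learn's MultinomialNB and LogisticRegression on a toy corpus (the documents and labels are just illustrative):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression

# Toy corpus, purely illustrative.
docs = ["free money now", "win money free",
        "meeting at noon", "project meeting notes"]
labels = ["spam", "spam", "ham", "ham"]

X = CountVectorizer().fit_transform(docs)

# Generative: MultinomialNB estimates p(y) and p(x|y) from counts,
# then applies Bayes' rule to obtain p(y|x) at prediction time.
nb = MultinomialNB().fit(X, labels)

# Discriminative: LogisticRegression fits p(y|x) directly,
# without modelling how the words themselves are distributed.
lr = LogisticRegression().fit(X, labels)

print(nb.predict_proba(X[:1]))  # posterior obtained via Bayes' rule
print(lr.predict_proba(X[:1]))  # posterior modelled directly
```

Both end up producing p(y|x) at prediction time; the difference is whether that posterior is obtained indirectly through p(x|y) * p(y) or fitted directly.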
A good read I can recommend in this context is "On Discriminative vs. Generative Classifiers: A Comparison of Logistic Regression and Naive Bayes" (Ng & Jordan, NIPS 2001).
Upvotes: 6