VeilEclipse

Reputation: 2856

Difference in KL divergence results between dit and scipy.stats

I am trying to calculate the KL divergence between two probability distributions in Python.

First, I use the dit library:

import dit
from dit.divergences import kullback_leibler_divergence

p = dit.Distribution(['0', '1'], [3/4, 1/4])
q = dit.Distribution(['0', '1'], [1/2, 1/2])
kullback_leibler_divergence(p, q)

This returns 0.1887

Then I try to do the same using scipy:

from scipy.stats import entropy

p = [3/4, 1/4]
q = [1/2, 1/2]
entropy(p, q)

This returns 0.1308

Why are the results different?

Upvotes: 0

Views: 131

Answers (1)

arduinolover

Reputation: 176

The logarithm bases are different: dit reports the divergence in bits (log base 2), whereas scipy's entropy uses the natural log (nats). Dividing the scipy result by ln(2) recovers the dit value: 0.1308 / ln(2) ≈ 0.1887.
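A minimal sketch of reconciling the two, using the question's own p and q; scipy.stats.entropy accepts a base argument, so you can request bits directly:

from math import log
from scipy.stats import entropy

p = [3/4, 1/4]
q = [1/2, 1/2]

# scipy's default is the natural log (nats): ~0.1308
kl_nats = entropy(p, q)

# base=2 gives bits, matching dit: ~0.1887
kl_bits = entropy(p, q, base=2)

# converting by hand: nats / ln(2) = bits
print(kl_nats, kl_bits, kl_nats / log(2))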

Upvotes: 3
