tan

Reputation: 1569

KL divergence in R using base 2 logarithm

I want to find the JS divergence of two distributions in R. Wikipedia says that the Jensen–Shannon divergence is bounded by 1 when the base-2 logarithm is used, and I want my resulting JS divergence to lie between 0 and 1. I am using the KLdiv function in R to compute JS:

JSD(P || Q)= 1/2*D(P || M) + 1/2*D(Q || M)  

where M = 0.5 * (P + Q) and KLdiv(P, M) computes the Kullback–Leibler divergence D(P || M).

But I want to specify that I need the base-2 logarithm. It looks like KLdiv does not let me specify which logarithm to use. Any clue as to how to do that?

OK, this is my R code for finding the JS divergence between two distributions:

library(flexmix)
m <- 0.5 *(dist1 + dist2) #JSD(P||Q)=0.5*D(P||M) + 0.5*D(Q||M), where M=0.5*(P+Q)
Dpm <- KLdiv(cbind(dist1,m))
Dqm <- KLdiv(cbind(m,dist2))
js <- 0.5*Dpm + 0.5*Dqm

I want a JS value between 0 and 1, which according to Wikipedia is possible only if I take the base-2 logarithm. How can I do this with my existing R code?

Upvotes: 1

Views: 2010

Answers (1)

fotNelton

Reputation: 3894

Generally speaking, it holds that

log_a(b) = log_x(b) / log_x(a)

meaning that if you want the logarithm of b to base a but only have a function that computes logarithms to some other base x, you can still easily obtain the logarithm to base a.
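As a quick numeric check of the change-of-base rule (a minimal sketch using only base R's `log` and `log2`):

```r
# Change of base: log_a(b) = log_x(b) / log_x(a)
# Example: log2(8) recovered from natural logarithms.
log(8) / log(2)   # 3
log2(8)           # 3, via the built-in base-2 logarithm
```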

Hence:

D_x(P || Q) = D_e(P || Q) / log_e(x)

So if you want the KL divergence with respect to base x, you just have to divide the result of computing the KL divergence with base e by the logarithm of x to base e (or whatever base the implementation of KLdiv is using).
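For instance, with a KL divergence computed directly from natural logs (a sketch with made-up distributions `p` and `q`; any natural-log implementation such as `KLdiv` rescales the same way):

```r
p <- c(0.5, 0.5)
q <- c(0.9, 0.1)
kl_e <- sum(p * log(p / q))   # D(P || Q) using base e
kl_2 <- kl_e / log(2)         # the same divergence in base 2
# kl_2 agrees with sum(p * log2(p / q)) computed directly
```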

By the way, you forgot to mention which KLdiv function you are using.

Secondly, looking at your R code, I think you might want to re-read the definition of the JS divergence, in particular the definition of M and how it enters each KL term.
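Putting both points together, one way the full computation could look (a sketch, assuming `dist1` and `dist2` are probability vectors summing to 1, with a hand-rolled KL function standing in for whatever `KLdiv` implementation you use):

```r
dist1 <- c(0.5, 0.3, 0.2)                  # example distributions
dist2 <- c(0.2, 0.4, 0.4)
m <- 0.5 * (dist1 + dist2)                 # M = 0.5 * (P + Q)

kl <- function(p, q) sum(p * log(p / q))   # D(p || q), natural log

js  <- 0.5 * kl(dist1, m) + 0.5 * kl(dist2, m)  # JSD in nats
js2 <- js / log(2)                         # rescale to base 2: 0 <= js2 <= 1
```

Note that both KL terms take M as the second argument, matching JSD(P || Q) = 0.5*D(P || M) + 0.5*D(Q || M).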

Upvotes: 2

Related Questions