user2276280

Reputation: 601

Calculating conditional entropy for a decision tree

I'm trying to calculate conditional entropy in order to calculate information gain for decision trees. I'm having a little trouble with the implementation in Java. An example may look like:

 X   Y  f(x)   
 1   0   A
 1   0   A
 0   1   B

Given this example, how would I go about calculating conditional entropy in Java? I understand the math behind it but am just confused on the implementation.

An example can be found here: http://en.wikipedia.org/wiki/Conditional_entropy

Upvotes: 1

Views: 1741

Answers (1)

Stefan D

Reputation: 1259

Conditional entropy for variable Y:

(Probability of Y = 0)(Entropy of f(x) when Y = 0) + (Probability of Y = 1)(Entropy of f(x) when Y = 1)

In your example:

(2/3)*(-(2/2)*log2(2/2)) + (1/3)*(-(1/1)*log2(1/1)) = (2/3)*0 + (1/3)*0 = 0

In other words, this is a bad example, because your conditional entropy is always 0: within each value of Y, f(x) takes only a single value, so there is no remaining uncertainty. Maybe this will help: http://www.onlamp.com/pub/a/php/2005/03/24/joint_entropy.html?page=3
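As for the Java implementation: one way to compute H(f(x) | Y) is to count how often each Y value occurs and how often each (Y, label) pair occurs, then sum -P(Y=y, f=label) * log2 P(f=label | Y=y) over all pairs. A rough sketch (the class and method names are just placeholders, not from any library):

```java
import java.util.HashMap;
import java.util.Map;

public class ConditionalEntropy {

    // H(f | Y) = sum over all (y, label) pairs of
    //   -P(Y = y, f = label) * log2( P(f = label | Y = y) )
    // y and f are parallel arrays: row i has Y value y[i] and class label f[i].
    public static double conditionalEntropy(int[] y, String[] f) {
        int n = y.length;
        Map<Integer, Integer> yCounts = new HashMap<>();   // counts of each Y value
        Map<String, Integer> pairCounts = new HashMap<>(); // counts of each (Y, label) pair
        for (int i = 0; i < n; i++) {
            yCounts.merge(y[i], 1, Integer::sum);
            pairCounts.merge(y[i] + "|" + f[i], 1, Integer::sum);
        }
        double h = 0.0;
        for (Map.Entry<String, Integer> e : pairCounts.entrySet()) {
            int yVal = Integer.parseInt(e.getKey().split("\\|")[0]);
            double pJoint = (double) e.getValue() / n;                 // P(Y=y, f=label)
            double pCond = (double) e.getValue() / yCounts.get(yVal);  // P(f=label | Y=y)
            h -= pJoint * (Math.log(pCond) / Math.log(2));             // log base 2
        }
        return h;
    }

    public static void main(String[] args) {
        // The table from the question: Y = {0, 0, 1}, f(x) = {A, A, B}
        int[] y = {0, 0, 1};
        String[] f = {"A", "A", "B"};
        System.out.println(conditionalEntropy(y, f)); // 0.0 for this data
    }
}
```

For your table this prints 0.0, matching the hand calculation above, since each Y value maps to exactly one label.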

Upvotes: 0
