R.K

Reputation: 67

tf.nn.softmax_cross_entropy_with_logits is giving wrong output. Why?

I want to calculate the cross entropy with the following command, which first applies softmax to the logits and then applies the cross entropy function. The cross entropy formula is sum(b * log(a)), where b is the correct label and a is the predicted probability. Since the sum is taken over all classes, I expect the multiplication to go like this. If the output after the softmax operation is

 [0.45186275  0.27406862  0.27406862]

Now if I apply cross entropy it should be like this

 ( 1 * log 0.45186275 + 0 * log 0.27406862 + 0 * log 0.27406862 ) 

My output is 0.794377, but I'm expecting a different result.

import tensorflow as tf

a = tf.constant([0.9, 0.4, 0.4])   # logits
b = tf.constant([1.0, 0.0, 0.0])   # labels as floats (a one-hot probability distribution)
result = tf.nn.softmax_cross_entropy_with_logits(logits=a, labels=b)
sess = tf.InteractiveSession()
sess.run(tf.initialize_all_variables())  # no variables here, but harmless
print(result.eval())
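
For reference, the softmax values quoted above can be reproduced by hand, for example with NumPy (a quick check outside of TF):

import numpy as np

logits = np.array([0.9, 0.4, 0.4])
softmax = np.exp(logits) / np.sum(np.exp(logits))
print(softmax)  # ~[0.45186275 0.27406862 0.27406862]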

Upvotes: 0

Views: 1308

Answers (1)

Anton Panchishin

Reputation: 3773

I believe that tf.nn.softmax_cross_entropy_with_logits calculates sum(-b * log(softmax(a))), so 0.794377 is the expected output. If you don't include the negative sign when you do it by hand, you'll get -0.794377, which will still work as a loss term, but you'll have to maximize your loss, not minimize it, when training.

It is also important to note that TF uses the natural log, not log base 10.
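
You can verify this by hand, for example with NumPy (a minimal sketch using the logits and one-hot label from the question):

import numpy as np

a = np.array([0.9, 0.4, 0.4])   # logits from the question
b = np.array([1.0, 0.0, 0.0])   # one-hot label

softmax = np.exp(a) / np.sum(np.exp(a))
cross_entropy = -np.sum(b * np.log(softmax))  # np.log is the natural log
print(cross_entropy)  # ~0.794377, matching what TF returns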

Upvotes: 1
