Blue

Reputation: 663

Keras Tensorflow Binary Cross entropy loss greater than 1

Library: Keras, backend: Tensorflow

I am training a binary classification model whose final layer has a single node with a sigmoid activation. I compile the model with binary cross-entropy loss. When I run the code to train the model, I notice that the loss is greater than 1. Is that right, or am I going wrong somewhere? I have checked the labels; they are all 0s and 1s.

Is it possible to have the binary cross entropy loss greater than 1?

Upvotes: 7

Views: 15979

Answers (2)

Dr. Snoopy

Reputation: 56367

Yes, it's correct: the cross-entropy is not bounded to any specific range, it is just positive (> 0).
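To see why, you can compute the binary cross-entropy formula by hand. This is a minimal sketch in plain Python (not the Keras implementation itself, though Keras applies a similar clipping internally); a confident wrong prediction easily pushes the loss above 1:

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Clip the probability away from 0 and 1 to avoid log(0)
    p = min(max(y_pred, eps), 1 - eps)
    # BCE = -(y*log(p) + (1-y)*log(1-p))
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

# True label 1, predicted probability 0.1: loss = -log(0.1) ≈ 2.303
print(binary_crossentropy(1.0, 0.1))

# True label 1, predicted probability 0.9: loss = -log(0.9) ≈ 0.105
print(binary_crossentropy(1.0, 0.9))
```

As the predicted probability for the true class approaches 0, the loss grows without bound, so values above 1 are entirely normal early in training.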

Upvotes: 9

Y. Luo

Reputation: 5732

Keras binary_crossentropy first converts your predicted probability to logits. Then it uses tf.nn.sigmoid_cross_entropy_with_logits to calculate the cross entropy and returns the mean. Mathematically speaking, if your label is 1 and your predicted probability is low (like 0.1), the cross entropy can be greater than 1, as in losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1])).
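The cited example can be reproduced in plain Python by mirroring that path: convert the probability to a logit, then apply the numerically stable formula that tf.nn.sigmoid_cross_entropy_with_logits documents, max(x, 0) - x*z + log(1 + exp(-|x|)). This is a sketch of the computation, not the TensorFlow code itself:

```python
import math

p = 0.1                        # predicted probability
z = 1.0                        # true label
x = math.log(p / (1 - p))      # probability converted back to a logit

# Stable form of -(z*log(sigmoid(x)) + (1-z)*log(1-sigmoid(x)))
loss = max(x, 0) - x * z + math.log(1 + math.exp(-abs(x)))
print(loss)  # ≈ 2.303, the same as -log(0.1), clearly greater than 1
```

The result matches -log(0.1) exactly, confirming that a loss above 1 simply reflects a low predicted probability for the true class.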

Upvotes: 10
