Reputation: 73
I'm learning about various loss functions used in deep learning. I need some help implementing a custom loss function in TensorFlow. To get a concrete picture of this, I would like to implement a custom binary cross-entropy loss as an example.
Edit: The following is the loss function I have implemented:
def custom_loss(eps):
    def loss(y_true, y_pred):
        ans = -eps*(y_true*tf.log(y_pred) + (1-y_true)*tf.log(y_pred))
        return ans
    return loss
This returns NaN (not a number) after some time. I tried adding a small quantity inside the log function. Furthermore, I have changed the optimiser to Adam.
Upvotes: 3
Views: 2519
Reputation: 14485
I think this is a problem with numerical computation whenever y_pred == 0. Note that log(0) is undefined, so to keep the loss calculation numerically stable we tend to compute tf.log(y_pred + epsilon), where epsilon is a very small number that has a negligible effect on the loss but avoids returning NaN when we would otherwise divide by zero or take log(0). I assume this is what you were aiming for with the eps parameter, but you ought to put it inside the call to tf.log().
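The instability is easy to reproduce outside TensorFlow. A minimal NumPy sketch (the 1e-7 epsilon here is just an illustrative choice, not a value from the question):

```python
import numpy as np

# log(0) evaluates to -inf; multiplying that by 0 yields NaN,
# which then poisons the whole loss once any prediction hits 0.
with np.errstate(divide="ignore", invalid="ignore"):
    unstable = 0.0 * np.log(0.0)        # 0 * -inf -> nan
    stable = 0.0 * np.log(0.0 + 1e-7)   # finite after adding epsilon

print(unstable)  # nan
print(stable)    # -0.0
```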
Perhaps something like this:
def custom_loss(eps):
    def loss(y_true, y_pred):
        # eps inside the log keeps log(0) from producing NaN; note the
        # negative-class term should use 1 - y_pred, not y_pred
        ans = -(y_true*tf.log(y_pred + eps) + (1-y_true)*tf.log(1 - y_pred + eps))
        return ans
    return loss
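For reference, the same stabilised binary cross-entropy can be sketched in plain NumPy (the helper name bce_loss and the sample inputs are my own, for illustration only; the negative-class term uses 1 - y_pred):

```python
import numpy as np

def bce_loss(y_true, y_pred, eps=1e-7):
    """Numerically stable binary cross-entropy (NumPy sketch)."""
    # Epsilon inside each log keeps log(0) from producing -inf/NaN.
    return -np.mean(y_true * np.log(y_pred + eps)
                    + (1 - y_true) * np.log(1 - y_pred + eps))

y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 0.1, 1.0, 0.0])  # includes the exact 0/1 edge cases

print(bce_loss(y_true, y_pred))  # finite, no NaN
```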
Upvotes: 4