Reputation: 115
I'm trying to rewrite a Keras graph as a TensorFlow graph, but I'm not sure which loss function is the equivalent of "Binary Cross Entropy". Is it tf.nn.softmax_cross_entropy_with_logits_v2?
Thanks a lot!
Upvotes: 1
Views: 6156
Reputation: 3974
No. With the TensorFlow backend, binary_crossentropy is implemented in the Keras backend as follows:
@tf_export('keras.backend.binary_crossentropy')
def binary_crossentropy(target, output, from_logits=False):
  """Binary crossentropy between an output tensor and a target tensor.

  Arguments:
      target: A tensor with the same shape as `output`.
      output: A tensor.
      from_logits: Whether `output` is expected to be a logits tensor.
          By default, we consider that `output`
          encodes a probability distribution.

  Returns:
      A tensor.
  """
  # Note: nn.sigmoid_cross_entropy_with_logits
  # expects logits, Keras expects probabilities.
  if not from_logits:
    # transform back to logits
    epsilon_ = _to_tensor(epsilon(), output.dtype.base_dtype)
    output = clip_ops.clip_by_value(output, epsilon_, 1 - epsilon_)
    output = math_ops.log(output / (1 - output))
  return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)
Therefore, it uses sigmoid_cross_entropy_with_logits, not softmax_cross_entropy_with_logits.
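
As a quick sanity check, here is a minimal sketch (assuming TensorFlow 2.x with eager execution; the toy labels and probabilities are made up) showing that Keras's binary crossentropy on probabilities matches tf.nn.sigmoid_cross_entropy_with_logits on the corresponding logits:

import numpy as np
import tensorflow as tf

# Toy binary labels and predicted probabilities (illustrative values only).
labels = np.array([1., 0., 1., 0.], dtype=np.float32)
probs  = np.array([0.9, 0.2, 0.6, 0.4], dtype=np.float32)

# Keras computes binary crossentropy directly from probabilities.
keras_bce = tf.keras.backend.binary_crossentropy(labels, probs)

# The raw TensorFlow op expects logits, so convert probabilities first
# via the inverse sigmoid: logit(p) = log(p / (1 - p)).
logits = np.log(probs / (1 - probs))
tf_bce = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

print(keras_bce.numpy())  # per-element losses
print(tf_bce.numpy())     # same values (up to Keras's epsilon clipping)

The two results agree element-wise, apart from the small epsilon clipping Keras applies before inverting the sigmoid.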
Upvotes: 4