Kaiming

Reputation: 183

Keras custom loss function yields weird result

I'm trying to write a custom weighted binary cross-entropy loss function in Keras. However, when I compiled my model with the custom loss, both the loss and the accuracy got worse. The accuracy is normally around 90% when I train with plain BCE, but it dropped to 3-10% with my custom loss function. Here is my custom loss function:

def weighted_crossentropy_core(y_true, y_pred, pos_weight, neg_weight):

    wcel = K.binary_crossentropy(y_true, y_pred)
    weight = y_true * pos_weight + (1.0 - y_true) * neg_weight
    wcel = K.mean(weight * wcel)

    return wcel

def weighted_crossentropy_wrapper(pos_weight=1, neg_weight=1):
    def weighted_crossentropy_return(y_true, y_pred):
        return weighted_crossentropy_core(y_true, y_pred, pos_weight, neg_weight)
    return weighted_crossentropy_return

wcel = weighted_crossentropy_wrapper()
model.compile(Adam(init_lr), loss=wcel, metrics=["accuracy"])

The weirdest thing is that when I wrapped tf.keras.losses.binary_crossentropy and passed the wrapper in (which is essentially the same as Keras' BCE loss), the resulting losses were completely different from those I got when passing the Keras BCE directly! Here is how I wrap the BCE:

def wrapped_bce(y_true, y_pred):

    bce = tf.keras.losses.binary_crossentropy(y_true, y_pred)

    return bce

wcel = wrapped_bce()
model.compile(Adam(init_lr), loss=wcel, metrics=["accuracy"])

Is there anything wrong with the way I wrap my loss function? Thanks in advance!

Upvotes: 1

Views: 219

Answers (1)

Eric Fournie

Reputation: 1372

To wrap your loss in the second example, you should write wcel = wrapped_bce (without parentheses). Keras needs the function object itself, which it will call with y_true and y_pred during training; wrapped_bce() instead calls the function immediately, with no arguments.
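A quick sketch of the difference (assuming TensorFlow 2.x):

```python
import tensorflow as tf

def wrapped_bce(y_true, y_pred):
    # Thin wrapper with the (y_true, y_pred) signature Keras expects.
    return tf.keras.losses.binary_crossentropy(y_true, y_pred)

# Keras needs the function object; it supplies y_true and y_pred itself.
print(callable(wrapped_bce))  # True

# Calling it with empty parentheses fails immediately.
try:
    wrapped_bce()
except TypeError:
    print("TypeError: missing y_true and y_pred")
```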

In the first example, calling weighted_crossentropy_wrapper() with parentheses is actually fine, since it is a factory that returns the real loss function. Still, wrapping one wrapper around another while omitting y_true and y_pred from the outer signature looks a bit strange to me.

Upvotes: 0
