Vision

Reputation: 588

Combine cross-entropy and MSE in loss function

I am working on a regression problem. My dataset has labels in the range [0, 1]. By design, any label with a value over 0.3 is negated, e.g., 0.35 becomes -0.35.

In Keras, I first tried MSE as the loss function, but the performance was not good. After realizing that the labels carry a sign, I also tried binary cross-entropy, but the performance is still not good.

As explained above, it seems we could combine the two loss functions by summing them, but I don't know how to write the code. Also, if you have any other suggestions for this specific dataset, please let me know.

Upvotes: 3

Views: 7160

Answers (2)

user3731622

Reputation: 5095

You might want to use the Keras functional API to build a multi-output model.

You could create one output for the classification part of the model and one output for the regression part. (FYI, in the literature these are referred to as the classification head and regression head of the CNN.)

Then you can specify a loss function for each output.

You can also weight each loss function (i.e., set the weights of the linear combination of the losses over the model's outputs).

This type of multi-output model is explained in the Keras functional API guide. Read through the link, paying attention to the section "Multi-input and multi-output models".
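A minimal sketch of what this could look like, assuming a hypothetical input size and layer widths (the head names `magnitude` and `sign` are illustrative, not from the question):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Shared trunk (hypothetical sizes)
inputs = keras.Input(shape=(16,))
x = layers.Dense(32, activation="relu")(inputs)

# Regression head: predicts the magnitude of the label in [0, 1]
reg_out = layers.Dense(1, activation="sigmoid", name="magnitude")(x)
# Classification head: predicts the sign of the label
cls_out = layers.Dense(1, activation="sigmoid", name="sign")(x)

model = keras.Model(inputs=inputs, outputs=[reg_out, cls_out])
model.compile(
    optimizer="adam",
    # one loss per output, keyed by the output layer's name
    loss={"magnitude": "mse", "sign": "binary_crossentropy"},
    # weights of the linear combination of the two losses
    loss_weights={"magnitude": 1.0, "sign": 0.5},
)
```

With this setup, `model.fit` expects a pair of label arrays (one per head), and the total training loss is the weighted sum of the two per-head losses.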

Upvotes: 1

Julio Daniel Reyes

Reputation: 6365

You can create your own loss function. Check out the Keras documentation and source code for ideas, but it should be something like this:

from keras.losses import mean_squared_error, binary_crossentropy

def my_custom_loss(y_true, y_pred):
    # compute each built-in loss, then return their sum
    mse = mean_squared_error(y_true, y_pred)
    crossentropy = binary_crossentropy(y_true, y_pred)
    return mse + crossentropy

...

model.compile(loss=my_custom_loss, ...)

Also check out the backend API if you need basic tensor primitives.
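For instance, a hedged sketch of the same combined loss written with backend primitives, here with a hypothetical `alpha` weight to balance the two terms (not part of the answer above):

```python
from tensorflow.keras import backend as K
from tensorflow.keras.losses import binary_crossentropy

def weighted_custom_loss(y_true, y_pred, alpha=0.5):
    # hand-rolled MSE using backend primitives
    mse = K.mean(K.square(y_true - y_pred), axis=-1)
    # built-in cross-entropy term
    crossentropy = binary_crossentropy(y_true, y_pred)
    # weighted linear combination of the two losses
    return alpha * mse + (1.0 - alpha) * crossentropy
```

Pass it to `model.compile(loss=weighted_custom_loss)` as before; to tune `alpha`, wrap it in a closure or use `functools.partial`.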

Upvotes: 6
