kevin.w.johnson

Reputation: 1794

Custom Loss function in Keras Value Error

I am trying to predict 4 variables, but what I really care about is the sum of the variables rather than the value of any particular one. With that in mind, I've defined my model like this:

import keras.backend as K
from keras import losses
from keras.models import Sequential
from keras.layers import Dense

def baseline_model():
    def loss_function(y_actual, y_predict):
        return losses.mean_squared_error(
            K.sum(y_actual), 
            K.sum(y_predict)
        )

    model = Sequential()
    model.add(Dense(60, input_dim=40, kernel_initializer='normal', activation='relu'))
    model.add(Dense(30, kernel_initializer='normal', activation='relu'))
    model.add(Dense(5, kernel_initializer='normal'))
    model.compile(loss=loss_function, optimizer='adam')

    return model

Here I'm trying to return the mean squared error of the sums. I'm getting an error saying:

ValueError: Invalid reduction dimension -1 for input with 0 dimensions. for 'loss_10/dense_33_loss/loss_function/Mean' (op: 'Mean') with input shapes: [], [] and with computed input tensors: input[1] = <-1>.

Any idea how I can properly accomplish this or where I'm going wrong?

Upvotes: 0

Views: 604

Answers (1)

Mark Snyder

Reputation: 1665

By default, K.sum computes the sum over all axes, reducing the input tensor to a scalar. That rank reduction is what's causing your error: losses.mean_squared_error then tries to take a mean along axis -1 of a 0-dimensional tensor, which is invalid. All you need to do is set keepdims=True in each call to K.sum, or alternatively set axis=1 (or whichever axis you want to sum over).

def loss_function(y_actual, y_predict):
    return losses.mean_squared_error(
        K.sum(y_actual, keepdims=True),
        K.sum(y_predict, keepdims=True)
    )

def loss_function(y_actual, y_predict):
    return losses.mean_squared_error(
        K.sum(y_actual, axis=1),
        K.sum(y_predict, axis=1)
    )

Both should compile just fine. You probably want the keepdims version, though.
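If it helps, here's a minimal sketch (assuming a TensorFlow backend via tf.keras) of how keepdims and axis change the shape K.sum produces on a typical (batch, outputs) tensor, which is why the default causes the rank error:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

# A batch of 3 samples with 5 outputs each, shape (3, 5)
y = K.constant(np.ones((3, 5)))

print(K.sum(y).shape)                 # () - all axes reduced, rank-0 scalar
print(K.sum(y, keepdims=True).shape)  # (1, 1) - summed, but rank preserved
print(K.sum(y, axis=1).shape)         # (3,) - one sum per sample in the batch
```

The rank-0 result in the first case is what mean_squared_error chokes on when it tries to reduce along axis -1.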

Upvotes: 1
