ALjazzguitar

Reputation: 107

How to prevent negative predictions in a Keras custom loss function

I'm using a custom loss function:

from keras import backend as K

def ratio_loss(y, y0):
    return K.mean(K.abs(y - y0) / y)

and I get negative predicted values, which in my case makes no sense (I use a CNN with a regression layer at the end to predict the length of an object). I used the division in order to penalize the error more heavily where the true value is relatively small compared to the prediction.
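For illustration, here is a minimal NumPy sketch (made-up values, not part of the model) of how the division weights the error, assuming the true values are never 0:

import numpy as np

y_true = np.array([10.0, 100.0])
y_pred = np.array([12.0, 102.0])   # both predictions are off by 2

# identical absolute errors, but the small true value is penalized 10x more
print(np.abs(y_true - y_pred) / y_true)   # [0.2, 0.02]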

How can I prevent the negative predictions?

This is the model (for now):

import keras
from keras.models import Sequential
from keras.layers import Conv2D, Dropout, BatchNormalization, MaxPooling2D, Flatten, Dense

def create_model():
    model = Sequential()
    model.add(Conv2D(128, kernel_size=(3, 3), activation='relu', padding='same', input_shape=(128, 128, 1)))
    model.add(Dropout(0.5))

    model.add(Conv2D(128, kernel_size=(3, 3), activation='relu', padding='same'))
    model.add(Dropout(0.25))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))

    model.add(Conv2D(64, kernel_size=(3, 3), activation='relu', padding='same'))
    model.add(Dropout(0.25))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))

    model.add(Conv2D(64, kernel_size=(3, 3), activation='relu', padding='same'))
    model.add(Dropout(0.25))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))

    model.add(Flatten())
    model.add(Dense(512, activation='relu'))

    model.add(Dropout(0.15))
    model.add(Dense(1))
    #model.compile(loss=keras.losses.mean_squared_error, optimizer=keras.optimizers.Adadelta(), metrics=[sacc])
    model.compile(loss=ratio_loss, optimizer=keras.optimizers.Adadelta(), metrics=[sacc])  # sacc: custom metric defined elsewhere
    return model

Thanks, Amir

Upvotes: 2

Views: 6486

Answers (2)

Daniel Möller

Reputation: 86600

def ratio_loss(y, y0):
    return K.mean(K.abs(y - y0) / y)

But what is the range of your expected output?

You should probably be using some activation function at the end such as:

  • activation='sigmoid' - from 0 to 1
  • activation='tanh' - from -1 to +1
  • activation='softmax' - if it's a classification problem with only one correct class
  • activation='softplus' - from 0 to +inf
  • etc.

Usage in the last layer:

model.add(Dense(1, activation='sigmoid'))  # from 0 to 1

# optional, from 0 to 200 after using the sigmoid above
model.add(Lambda(lambda x: 200 * x))
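If you go this route, note that Lambda must be imported along with the other layers (in standalone Keras):

from keras.layers import Lambda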

Hint: if you're a beginner, avoid relying too heavily on "relu"; it often gets stuck at 0 and must be used with carefully selected learning rates.

Upvotes: 1

Gerry

Reputation: 121

You could continue training your neural network, and hopefully it will learn not to make any prediction below 0 (assuming all of the training data has outputs above 0). You could then add a post-prediction step: if the model makes any prediction below 0, just convert it to 0.
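A minimal sketch of that clipping step (NumPy only; model and x_test stand in for your trained model and test inputs):

import numpy as np

preds = model.predict(x_test)      # may contain negative values
preds = np.clip(preds, 0, None)    # replace anything below 0 with 0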

You could add an activation function as Daniel Möller answered.

That would involve changing

model.add(Dense(1))

to

model.add(Dense(1, activation='softplus'))

since you mentioned in a comment that you wanted the output to be from 0 to ~200. This would guarantee there is no output below 0.
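Softplus is defined as softplus(x) = ln(1 + e^x), which is strictly positive for every real input. A quick illustrative check in NumPy:

import numpy as np

x = np.array([-10.0, 0.0, 10.0])
print(np.log1p(np.exp(x)))   # approx. [0.000045, 0.693, 10.0] -- always > 0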

Upvotes: 2
