Fokke

Reputation: 81

How to define sample weights within custom loss function?

I need to classify binary sequences, and I want to assign higher weights to later parts of each sequence.

For example, I want to assign:

sequence      -- weights for the samples in this sequence
[0,0,0,1,1,1] -- [1,1,1,2,3,4]

Input is in shape m x n, thus m samples of length n.

I want to assign the weights within a custom loss function, both because I want to learn how this works and because using sample_weight gives me all kinds of dimensionality problems.

Now I use the following loss function:

from keras import backend as K
from keras import losses as loss

def weightedLoss(weight):
    def binaryPart(yTrue, yPred):
        # multiply the per-sample crossentropy by the weight matrix
        return K.mean(loss.binary_crossentropy(yTrue, yPred) * weight)
    return binaryPart

Here weight is an m x n matrix containing the sample weights (m = 20000, n = 63). The error message I get is:

InvalidArgumentError: Incompatible shapes: [64] vs. [20000,63] [[{{node loss_39/dense_120_loss/mul}}]]

I do not understand what this error means. dense_120 is my output layer here, with output shape (None, 63).

I feel like there is an error in the definition of my loss function; what am I missing?

Upvotes: 3

Views: 620

Answers (1)

Mohamad Zeina

Reputation: 433

As Daniel said, your output layer should contain 64 neurons, but it looks like it currently contains 63. This should fix your error.

To answer your question regarding sample weights: the Keras .fit method already takes a sample_weight argument, so you can use this without creating a custom loss function. For per-timestep weights like yours, you also need to pass sample_weight_mode='temporal' to compile, so that Keras accepts a 2D weight array instead of one weight per sequence.

fit(x=None, y=None, batch_size=None, epochs=1, verbose=1, callbacks=None, validation_split=0.0, validation_data=None, shuffle=True, class_weight=None, sample_weight=None, initial_epoch=0, steps_per_epoch=None, validation_steps=None, validation_freq=1)
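To make the shapes concrete, here is a NumPy sketch of both approaches (the weight ramp [1,1,1,2,3,4] and the small sizes are illustrative stand-ins, not your actual data). For fit, the weight array has shape (m, n); for a custom loss, a vector of shape (n,) works better, because Keras calls the loss on a batch, so a fixed (m, n) weight tensor cannot line up with (batch_size, n) predictions — which is exactly the kind of shape mismatch in your error.

```python
import numpy as np

m, n = 8, 6    # small stand-ins for your 20000 x 63 data
batch = 4      # whatever batch_size fit() happens to use

# Per-timestep weights: later positions in the sequence count more.
timestep_weights = np.array([1, 1, 1, 2, 3, 4], dtype="float32")  # shape (n,)

# For fit(..., sample_weight=...) with sample_weight_mode='temporal'
# in compile(): one row of weights per sequence, shape (m, n).
fit_weights = np.tile(timestep_weights, (m, 1))

# For a custom loss: a (n,)-shaped vector broadcasts cleanly over the
# (batch, n) per-timestep crossentropy, for any batch size.
def weighted_bce(y_true, y_pred, weights, eps=1e-7):
    y_pred = np.clip(y_pred, eps, 1 - eps)
    bce = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
    return float(np.mean(bce * weights))   # (batch, n) * (n,) broadcasts

y_true = np.random.randint(0, 2, size=(batch, n)).astype("float32")
y_pred = np.random.rand(batch, n).astype("float32")
loss_value = weighted_bce(y_true, y_pred, timestep_weights)
```

Inside a Keras custom loss the same broadcasting works with K.constant(timestep_weights) in place of the NumPy vector.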

Upvotes: 1
