Jonathan

Reputation: 1936

Weighted hinge loss function

I am defining my vanilla hinge loss as:

import tensorflow as tf

def hinge(y_true, y_pred):
    # standard hinge loss: zero once y_pred is on the correct side of the margin
    return tf.maximum(0., 1. - y_true * y_pred)

I'm training an SVM to predict an event, and my y values are 1 if a sample belongs to that class and -1 if it does not. My classes are imbalanced: I have many more -1 samples than +1 samples.

Hence, I'd like to weight the loss so that the +1 class carries a higher penalty, but I'm not sure exactly how I'd change my hinge loss. The best I can think of is:

X = (# of non-event samples) / (# of event samples)
if y_true == 1:
    return tf.maximum(0., 1. - y_true * y_pred) * X

This means that if I have 100 non-event samples and 10 event samples, then X = 100/10 = 10. The loss is then hinge_loss * 10 whenever y_true = 1.
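For reference, here is a vectorised sketch of that idea using tf.where (assuming TensorFlow 2.x; the weighted_hinge name is just illustrative, and a plain Python if would not apply element-wise over a batch tensor):

import tensorflow as tf

def weighted_hinge(y_true, y_pred, X=10.):
    # weight X for event (+1) samples, weight 1 for non-event (-1) samples
    weights = tf.where(y_true > 0., X * tf.ones_like(y_true), tf.ones_like(y_true))
    return weights * tf.maximum(0., 1. - y_true * y_pred)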

Is this right or is there a better way to do this?

Upvotes: 0

Views: 767

Answers (1)

Juan Carlos Ramirez

Reputation: 2129

How about:

def hinge(y_true, y_pred):
    # (11/9 + y_true) * 9/2 is 10 when y_true == 1 and 1 when y_true == -1
    return tf.multiply((11/9 + y_true) * 9/2, tf.maximum(0., 1. - y_true * y_pred))

The logic here is that we want to multiply by 10 if y_true is 1, and by 1 if it is -1. You can check that when y_true is -1, (11/9 + y_true) * 9/2 evaluates to 1, and when it is 1, it evaluates to 10. If you are interested in how the 11/9 in the expression is derived, it is the result of solving the following linear equation for the desired shift s:

10(s + (-1)) = 1(s + 1), which gives s = 11/9. The 9/2 factor then rescales (11/9 - 1) = 2/9 to exactly 1, and (11/9 + 1) = 20/9 to 10.
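As a quick sanity check (a sketch assuming TensorFlow 2.x eager execution, with made-up predictions of 0 so both samples have the same raw margin):

import tensorflow as tf

def hinge(y_true, y_pred):
    return tf.multiply((11/9 + y_true) * 9/2, tf.maximum(0., 1. - y_true * y_pred))

# one event sample (+1) and one non-event sample (-1), both predicted at 0
y_true = tf.constant([1., -1.])
y_pred = tf.constant([0., 0.])
print(hinge(y_true, y_pred).numpy())  # approximately [10.  1.]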

Upvotes: 1
