hakaishinbeerus

Reputation: 327

Using Tensorflow Huber loss in Keras

I am trying to use Huber loss in a Keras model (writing a DQN), but I am getting bad results and I think I am doing something wrong. My code is below.

    model = Sequential()
    model.add(Dense(output_dim=64, activation='relu', input_dim=state_dim))
    model.add(Dense(output_dim=number_of_actions, activation='linear'))
    loss = tf.losses.huber_loss(delta=1.0)
    model.compile(loss=loss, opt='sgd')
    return model

Upvotes: 18

Views: 22048

Answers (5)

Clap

Reputation: 1

To anyone still wondering about this: in TensorFlow 2.0, you can do it in the following way:

    model.compile(optimizer=custom_optimizer,  # add your optimizer
                  loss='huber')
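
For reference, here is what the built-in `'huber'` loss computes (quadratic inside `delta`, linear outside, averaged over elements; `delta` defaults to 1.0), sketched in plain NumPy as a sanity check:

```python
import numpy as np

def huber(y_true, y_pred, delta=1.0):
    # Element-wise Huber: quadratic for small errors, linear for large ones,
    # then averaged -- this mirrors the standard Huber-loss definition.
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    small = np.abs(err) <= delta
    quadratic = 0.5 * err ** 2
    linear = delta * (np.abs(err) - 0.5 * delta)
    return np.where(small, quadratic, linear).mean()

print(huber([0.0, 0.0], [0.5, 3.0]))  # one small error (quadratic branch), one large (linear branch)
```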

Upvotes: 0

Val

Reputation: 355

How about:

    loss=tf.keras.losses.Huber(delta=100.0)
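
One caveat worth noting: `delta` is the point where the loss switches from quadratic to linear, so `delta=100.0` behaves like plain squared error unless errors exceed 100. A quick NumPy check of the standard Huber formula:

```python
import numpy as np

def huber_elem(err, delta):
    # Standard element-wise Huber formula.
    a = np.abs(err)
    return np.where(a <= delta, 0.5 * err ** 2, delta * (a - 0.5 * delta))

err = np.array([-2.0, 0.5, 7.0])
# With a huge delta, every error falls in the quadratic branch ...
print(np.allclose(huber_elem(err, 100.0), 0.5 * err ** 2))  # True
# ... while delta=1.0 treats errors with |err| > 1 linearly.
print(huber_elem(err, 1.0))
```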

Upvotes: 5

Chris Marciniak

Reputation: 151

You can wrap Tensorflow's tf.losses.huber_loss in a custom Keras loss function and then pass it to your model.

The reason for the wrapper is that Keras will only pass y_true, y_pred to the loss function, and you likely want to also use some of the many parameters to tf.losses.huber_loss. So, you'll need some kind of closure like:

    import tensorflow as tf

    def get_huber_loss_fn(**huber_loss_kwargs):

        def custom_huber_loss(y_true, y_pred):
            return tf.losses.huber_loss(y_true, y_pred, **huber_loss_kwargs)

        return custom_huber_loss

    # Later...
    model.compile(
        loss=get_huber_loss_fn(delta=0.1),
        ...
    )
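
The closure trick itself is plain Python and independent of TensorFlow: the factory captures the extra keyword arguments, and Keras only ever calls the returned two-argument function. A minimal sketch of the pattern (the toy loss here is illustrative, not a real Keras loss):

```python
def get_loss_fn(base_loss, **kwargs):
    # Capture extra parameters; the wrapper keeps the (y_true, y_pred) signature.
    def loss_fn(y_true, y_pred):
        return base_loss(y_true, y_pred, **kwargs)
    return loss_fn

def scaled_abs_error(y_true, y_pred, scale=1.0):
    # Toy loss standing in for tf.losses.huber_loss.
    return scale * abs(y_true - y_pred)

loss = get_loss_fn(scaled_abs_error, scale=0.1)
print(loss(3.0, 1.0))  # 0.2
```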

Upvotes: 15

benbotto

Reputation: 2440

I came here with the exact same question. The accepted answer uses logcosh, which may have similar properties, but it isn't exactly Huber loss. Here's how I implemented Huber loss for Keras (note that I'm using Keras from TensorFlow 1.5).

    import tensorflow as tf

    '''
     ' Huber loss.
     ' https://jaromiru.com/2017/05/27/on-using-huber-loss-in-deep-q-learning/
     ' https://en.wikipedia.org/wiki/Huber_loss
    '''
    def huber_loss(y_true, y_pred, clip_delta=1.0):
      error = y_true - y_pred
      cond  = tf.keras.backend.abs(error) < clip_delta

      squared_loss = 0.5 * tf.keras.backend.square(error)
      linear_loss  = clip_delta * (tf.keras.backend.abs(error) - 0.5 * clip_delta)

      return tf.where(cond, squared_loss, linear_loss)

    '''
     ' Same as above but returns the mean loss.
    '''
    def huber_loss_mean(y_true, y_pred, clip_delta=1.0):
      return tf.keras.backend.mean(huber_loss(y_true, y_pred, clip_delta))

Depending on whether you want the element-wise loss or the mean of the loss, use the corresponding function above.
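
A quick NumPy sanity check of that distinction, using the same formula: the element-wise version returns one value per sample, while the mean version collapses them to a scalar:

```python
import numpy as np

err = np.array([0.3, -2.0, 1.5])
delta = 1.0
# Element-wise Huber, same formula as huber_loss above.
elementwise = np.where(np.abs(err) <= delta,
                       0.5 * err ** 2,
                       delta * (np.abs(err) - 0.5 * delta))
print(elementwise)         # one loss value per sample
print(elementwise.mean())  # single scalar, as huber_loss_mean returns
```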

Upvotes: 18

hakaishinbeerus

Reputation: 327

I was looking through the losses in Keras. Apparently logcosh has similar properties to Huber loss. More details on their similarity can be seen here.
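
The similarity is easy to verify numerically: log(cosh(x)) is approximately x**2 / 2 for small x and |x| - log(2) for large x, i.e. the same quadratic-then-linear shape as Huber loss:

```python
import numpy as np

x_small, x_large = 0.01, 20.0
# Near zero, log-cosh is approximately quadratic ...
print(np.log(np.cosh(x_small)), 0.5 * x_small ** 2)
# ... and for large errors it grows linearly, offset by log(2).
print(np.log(np.cosh(x_large)), abs(x_large) - np.log(2.0))
```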

Upvotes: 4
