user1128016

Reputation: 1538

Gradients in Keras loss function with RNNs

I have a simple test LSTM model:

from keras.layers import Input, LSTM, Dense
from keras.models import Model

inputs = Input(shape=(k, m))
layer1 = LSTM(128, activation='relu', return_sequences=True)(inputs)
layer2 = LSTM(128, activation='relu')(layer1)
predictions = Dense(1, activation='linear')(layer2)
model = Model(inputs=inputs, outputs=predictions)

and a custom loss function that uses the gradients of the model output with respect to the inputs:

from keras import backend as K
from keras import losses

def custom_loss(model, input_tensor):
    def loss(y_true, y_pred):
        # gradient of the model output w.r.t. the model input
        grads = K.gradients(model.output, model.input)[0]
        loss_f = losses.mean_squared_error(y_true, y_pred) + K.exp(-K.sum(grads))
        return loss_f
    return loss

Training the model fails with the error "Second-order gradient for while loops not supported":

model.compile(optimizer='adam', loss=custom_loss(model, inputs), metrics=['mean_absolute_error'])
model.fit(x_train, y_train, batch_size=32, epochs=20, verbose=1, validation_data=(x_val, y_val))

...
      159 
      160   if op_ctxt.grad_state:
-->   161     raise TypeError("Second-order gradient for while loops not supported.")
      162 
      163   if isinstance(grad, ops.Tensor):

TypeError: Second-order gradient for while loops not supported.

Why does TF try to compute second-order gradients here? The loss should only need first-order gradients.

The same loss function works fine for non-RNN models.

Upvotes: 0

Views: 285

Answers (1)

user1128016

Reputation: 1538

Setting the `unroll` property on the LSTM layers resolved the issue (unrolling replaces the internal while loop, which is what the error complains about, with a static graph):

layer1 = LSTM(128, activation='relu', return_sequences=True, unroll=True)(inputs)
layer2 = LSTM(128, activation='relu', unroll=True)(layer1)
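For reference, here is a runnable sketch of the same gradient-penalty idea written against TF 2.x's `tf.GradientTape` instead of `K.gradients` (in eager mode, nested tapes handle the input-gradient term directly). The shapes `k=4, m=3`, the layer sizes, and the random batch are made up for illustration:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

k, m = 4, 3  # hypothetical timestep/feature counts

# Same architecture as in the question, with unroll=True on both LSTMs.
inputs = keras.Input(shape=(k, m))
x = layers.LSTM(8, activation='relu', return_sequences=True, unroll=True)(inputs)
x = layers.LSTM(8, activation='relu', unroll=True)(x)
outputs = layers.Dense(1, activation='linear')(x)
model = keras.Model(inputs, outputs)

mse = keras.losses.MeanSquaredError()
optimizer = keras.optimizers.Adam()

x_batch = tf.random.normal((2, k, m))
y_batch = tf.random.normal((2, 1))

# Inner tape: gradient of the model output w.r.t. the input (the penalty term).
# Outer tape: gradient of the total loss w.r.t. the weights (for the optimizer),
# which is where the second-order gradient through the RNN arises.
with tf.GradientTape() as outer_tape:
    with tf.GradientTape() as inner_tape:
        inner_tape.watch(x_batch)
        y_pred = model(x_batch, training=True)
    input_grads = inner_tape.gradient(y_pred, x_batch)
    loss = mse(y_batch, y_pred) + tf.exp(-tf.reduce_sum(input_grads))

weight_grads = outer_tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(weight_grads, model.trainable_variables))
```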

Upvotes: 1
