zale

Reputation: 1279

Keras with TF backend: get gradient of outputs with respect to inputs

I have a very simple Keras MLP, and I'm trying to get the gradient of the outputs with respect to the inputs.

I'm using the following code (imports included for completeness):

import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

regressor = Sequential([
    Dense(32, input_shape=(n_features,), activation='relu'),
    Dense(1)
])
regressor.compile(optimizer=SGD(lr=0.1), loss='mse')

regressor.fit(x, y)

output_tens = regressor.layers[-1].output
input_tens = regressor.layers[0].input

grad = tf.gradients(output_tens, input_tens)
with tf.Session() as sess:
    sess.run(grad, feed_dict={input_tens: np.zeros((1, n_features))})

This fails with the following error:

FailedPreconditionError: Attempting to use uninitialized value dense_7/bias
     [[Node: dense_7/bias/read = Identity[T=DT_FLOAT, _class=["loc:@dense_7/bias"], _device="/job:localhost/replica:0/task:0/cpu:0"](dense_7/bias)]]

(The stack trace is long and, I assume, not very informative, so I'm not adding it here).

Is my approach basically correct? Is there anything special I have to do?

Thanks!

Upvotes: 3

Views: 1232

Answers (1)

Marcin Możejko

Reputation: 40516

You need to use the session Keras is already working with in order to make this work:

import keras.backend as K

# Reuse Keras's own session rather than opening (and closing) a new one;
# its variables were initialized during fit().
sess = K.get_session()
sess.run(grad, feed_dict={input_tens: np.zeros((1, n_features))})

When you instantiate a new session, it doesn't contain the variables that were initialized during Keras training; K.get_session() returns the session Keras itself uses, where those variables already exist.
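Separately from the session issue, you can sanity-check what tf.gradients should return for this Dense(relu) -> Dense(1) architecture with a plain-NumPy chain-rule computation. This is only an illustrative sketch: the weight values below are made up, and in practice you would pull the real kernels/biases out of the trained model with layer.get_weights().

```python
import numpy as np

# Hypothetical weights standing in for the trained Keras layers
# (n_features = 3, hidden size = 2 to keep the arithmetic readable).
n_features = 3
W1 = np.array([[0.5, -1.0],
               [2.0,  0.0],
               [-0.5, 1.5]])        # first Dense kernel, shape (n_features, hidden)
b1 = np.array([0.1, -0.2])          # first Dense bias
W2 = np.array([[1.0], [-2.0]])      # second Dense kernel, shape (hidden, 1)
b2 = np.array([0.3])                # second Dense bias

def input_grad(x):
    """Gradient of the scalar output w.r.t. a single input vector x."""
    z1 = x @ W1 + b1                    # hidden pre-activation
    relu_mask = (z1 > 0).astype(float)  # relu'(z1): 1 where active, 0 where clipped
    # Chain rule: d out / d x_i = sum_j W1[i, j] * relu'(z1_j) * W2[j, 0]
    return W1 @ (relu_mask * W2[:, 0])

print(input_grad(np.zeros(n_features)))  # gradient at x = 0: [0.5, 2.0, -0.5]
```

This computes the same quantity the tf.gradients call evaluates, so it is a cheap way to verify the session-based result on a tiny network.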

Upvotes: 1
