Starnetter

Reputation: 847

Apply tensorflow gradients to specific inputs

I am trying to create a Jacobian matrix for certain output variables with respect to specific input features in a Keras model. For instance, if I have a model with 100 input features and 10 output variables and I want the Jacobian of outputs 2, 3, and 4 with respect to inputs 50-70, I can create the Jacobian like this:

from keras.models import Model
from keras.layers import Dense, Input
import tensorflow as tf
import keras.backend as K
import numpy as np

input_ = Input(shape=(100,))
output_ = Dense(10)(input_)

model = Model(input_,output_)

x_indices = np.arange(50,70)
y_indices = [2,3,4]

y_list = tf.unstack(model.output[0])

x = np.random.random((1,100))

jacobian_matrix = []
for i in y_indices:
    J = tf.gradients(y_list[i], model.input)
    jacobian_func = K.function([model.input, K.learning_phase()], J)
    jac = jacobian_func([x, False])[0][0,x_indices]
    jacobian_matrix.append(jac)
jacobian_matrix = np.array(jacobian_matrix)

but with a much more complex model, this is extremely slow. I only want to create the Jacobian functions above with respect to the inputs of interest. I tried something like this:

from keras.models import Model
from keras.layers import Dense, Input
import tensorflow as tf
import keras.backend as K
import numpy as np

input_ = Input(shape=(100,))
output_ = Dense(10)(input_)

model = Model(input_,output_)

x_indices = np.arange(50,60)
y_indices = [2,3,4]

y_list = tf.unstack(model.output[0])
x_list = tf.unstack(model.input[0])

x = np.random.random((1,100))

jacobian_matrix = []
for i in y_indices:
    jacobian_row = []
    for j in x_indices:
        J = tf.gradients(y_list[i], x_list[j])
        jacobian_func = K.function([model.input, K.learning_phase()], J)
        jac = jacobian_func([x, False])[0][0,:]
        jacobian_row.append(jac)
    jacobian_matrix.append(jacobian_row)

jacobian_matrix = np.array(jacobian_matrix)

and got the Error:

TypeError                                 Traceback (most recent call last)
<ipython-input-33-d0d524ad0e40> in <module>()
     23     for j in x_indices:
     24         J = tf.gradients(y_list[i], x_list[j])
---> 25         jacobian_func = K.function([model.input, K.learning_phase()], J)
     26         jac = jacobian_func([x, False])[0][0,:]
     27         jacobian_row.append(jac)

/opt/conda/lib/python2.7/site-packages/keras/backend/tensorflow_backend.pyc in function(inputs, outputs, updates, **kwargs)
   2500                 msg = 'Invalid argument "%s" passed to K.function with TensorFlow backend' % key
   2501                 raise ValueError(msg)
-> 2502     return Function(inputs, outputs, updates=updates, **kwargs)
   2503 
   2504 

/opt/conda/lib/python2.7/site-packages/keras/backend/tensorflow_backend.pyc in __init__(self, inputs, outputs, updates, name, **session_kwargs)
   2443         self.inputs = list(inputs)
   2444         self.outputs = list(outputs)
-> 2445         with tf.control_dependencies(self.outputs):
   2446             updates_ops = []
   2447             for update in updates:

/opt/conda/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in control_dependencies(control_inputs)
   4302   """
   4303   if context.in_graph_mode():
-> 4304     return get_default_graph().control_dependencies(control_inputs)
   4305   else:
   4306     return _NullContextmanager()

/opt/conda/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in control_dependencies(self, control_inputs)
   4015       if isinstance(c, IndexedSlices):
   4016         c = c.op
-> 4017       c = self.as_graph_element(c)
   4018       if isinstance(c, Tensor):
   4019         c = c.op

/opt/conda/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in as_graph_element(self, obj, allow_tensor, allow_operation)
   3033 
   3034     with self._lock:
-> 3035       return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
   3036 
   3037   def _as_graph_element_locked(self, obj, allow_tensor, allow_operation):

/opt/conda/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in _as_graph_element_locked(self, obj, allow_tensor, allow_operation)
   3122       # We give up!
   3123       raise TypeError("Can not convert a %s into a %s." % (type(obj).__name__,
-> 3124                                                            types_str))
   3125 
   3126   def get_operations(self):

TypeError: Can not convert a NoneType into a Tensor or Operation.

Any ideas? Thanks.

Upvotes: 0

Views: 579

Answers (2)

Starnetter

Reputation: 847

In case anyone wants a full solution based on @DomJack's answer:

from keras.models import Model
from keras.layers import Dense, Input, Concatenate
import tensorflow as tf
import keras.backend as K
import numpy as np

num_features = 100
input_ = Input(shape=(num_features,))
output_ = Dense(10)(input_)

model = Model(input_,output_)

# input range of interest
x_range = [50,70]
# output indices of interest
y_indices = [2,3,4]

# If model is saved, you can load using: 
#model = keras.models.load_model(filepath)
# then grab the input:
input_ = model.input

# Split inputs
uninteresting, interesting, more_uninteresting = tf.split(input_, [x_range[0], 
                                                                   x_range[1]-x_range[0], 
                                                                   num_features-x_range[1]], 
                                                          axis=1)
# Reassemble the full input and run the existing model on it
inputs = Concatenate()([uninteresting, interesting, more_uninteresting])
y = model(inputs)
y_list = tf.unstack(y[0])
x = np.random.random((1,num_features))

# Create Jacobian matrix
jacobian_matrix = []
for i in y_indices:
    J = tf.gradients(y_list[i], interesting)
    jacobian_func = K.function([input_, K.learning_phase()], J)
    jac = jacobian_func([x, False])[0][0]
    jacobian_matrix.append(jac)
jacobian_matrix = np.array(jacobian_matrix)
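
For the example above, jacobian_matrix comes out with shape (3, 20): one row per output in y_indices and one column per input feature in the 50-70 range.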

Upvotes: 0

DomJack

Reputation: 4183

The issue is with the line J = tf.gradients(y_list[i], x_list[j]). x_list[j] was created by unstacking model.input[0], so there is no directed path from x_list[j] to model.output[0] in the graph; tf.gradients therefore returns None, which is the NoneType in your error. You need to either split the model input, restack it and then run the model, or take the derivative with respect to the entire input and just slice out the columns of interest.

First way:

inputs = tf.keras.layers.Input(shape=(100,))
# split off the block of input features you want gradients for
uninteresting, interesting, more_uninteresting = tf.split(inputs, [50, 10, 40], axis=1)
# restack so the rest of the network still sees the full 100-feature input
restacked = tf.concat([uninteresting, interesting, more_uninteresting], axis=1)
# ... build/run the model on `restacked` and unstack its output into y_list ...
J, = tf.gradients(y_list[i], interesting)

Second way:

# take the derivative with respect to the whole input (not a slice of it),
# then keep only the columns for features 50-60
J, = tf.gradients(y_list[i], model.input)
J = J[:, 50:60]

Having said that, this is still going to be slow for a large number of y indices, so I'd strongly encourage you to be absolutely sure you need the Jacobian itself (and not, for example, the result of a Jacobian-vector product).
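
For reference, a single backward pass can give you the product of a weighting vector over the outputs with the Jacobian, by passing the vector through the grad_ys argument of tf.gradients. Below is a minimal sketch using the same toy model as in the question; the weight vector v is just an illustrative choice that combines outputs 2, 3 and 4.

from keras.models import Model
from keras.layers import Dense, Input
import tensorflow as tf
import keras.backend as K
import numpy as np

input_ = Input(shape=(100,))
output_ = Dense(10)(input_)
model = Model(input_, output_)

# illustrative weighting over the 10 outputs: sums the gradients of outputs 2, 3 and 4
v = np.zeros((1, 10), dtype=np.float32)
v[0, [2, 3, 4]] = 1.0

# one backward pass computes sum_i v[i] * d y_i / d x for all 100 input features
vjp, = tf.gradients(model.output, model.input, grad_ys=tf.constant(v))
vjp_func = K.function([model.input, K.learning_phase()], [vjp])

x = np.random.random((1, 100))
weighted_grads = vjp_func([x, False])[0][0, 50:70]  # keep only the input features of interest

This costs one backward pass regardless of how many outputs are combined, whereas building the full Jacobian needs one pass per output index.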

Upvotes: 1
