Anil Kumar Martha

Reputation: 67

Tensorflow - TypeError when using tf.cond()

Below is the snippet I used with TensorFlow 2.6.0:

import tensorflow as tf

x = tf.keras.Input(shape=(224, 224, 3), batch_size=None)
x1 = tf.keras.Input(1, dtype=tf.int32)
y = tf.cond(tf.less(x1, 5), lambda: tf.keras.layers.ReLU()(x), lambda: tf.keras.layers.LeakyReLU(alpha=0.1)(x))
model = tf.keras.models.Model(inputs=[x, x1], outputs=[y])
model.summary()

Error: python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 969, in convert (str(python_func), type(x)))

TypeError: To be compatible with tf.eager.defun, Python functions must return zero or more Tensors; in compilation of <function at 0x7fe4743fde60>, found return value of type <class 'keras.engine.keras_tensor.KerasTensor'>, which is not a Tensor.

Upvotes: 1

Views: 1828

Answers (1)

AloneTogether

Reputation: 26708

The problem is that tf.cond will not work with KerasTensors (the symbolic tensors returned by tf.keras.Input and Keras layers). You could try wrapping the tf.cond call in a custom layer:

import tensorflow as tf

class ConditionalActivationLayer(tf.keras.layers.Layer):

  def call(self, inputs):
    x1, x = inputs[0], inputs[1]
    return tf.cond(tf.less(x1, 5), lambda: tf.nn.relu(x), lambda: tf.nn.leaky_relu(x, alpha=0.1))

x = tf.keras.Input(shape=(224, 224, 3), batch_size=None)
x1 = tf.keras.Input(1, dtype=tf.int32)
y = ConditionalActivationLayer()([x1, x])
model = tf.keras.models.Model(inputs=[x, x1], outputs=[y])
model.summary()
Model: "model"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_6 (InputLayer)           [(None, 1)]          0           []                               
                                                                                                  
 input_5 (InputLayer)           [(None, 224, 224, 3  0           []                               
                                )]                                                                
                                                                                                  
 conditional_activation_layer (  (None, 224, 224, 3)  0          ['input_6[0][0]',                
 ConditionalActivationLayer)                                      'input_5[0][0]']                
                                                                                                  
==================================================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
__________________________________________________________________________________________________

Inside the layer's call, this also works:

return tf.cond(tf.less(x1, 5), lambda: tf.keras.layers.ReLU()(x), lambda: tf.keras.layers.LeakyReLU(alpha=0.1)(x))

Whether you use the tf.nn functions or the Keras layer classes is just a matter of taste.
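To sanity-check the layer's branching outside of a model, you can also call it eagerly on ordinary tensors with hypothetical dummy values; note that tf.cond expects a scalar boolean predicate, which a scalar x1 provides here:

```python
import tensorflow as tf

class ConditionalActivationLayer(tf.keras.layers.Layer):

  def call(self, inputs):
    x1, x = inputs[0], inputs[1]
    return tf.cond(tf.less(x1, 5), lambda: tf.nn.relu(x), lambda: tf.nn.leaky_relu(x, alpha=0.1))

layer = ConditionalActivationLayer()
x = tf.constant([[-2.0, 3.0]])

# 1 < 5, so the ReLU branch runs and zeroes the negative entry.
print(layer([tf.constant(1), x]).numpy())  # [[0. 3.]]
# 9 >= 5, so the LeakyReLU branch scales it by alpha=0.1 instead.
print(layer([tf.constant(9), x]).numpy())  # [[-0.2  3. ]]
```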

You could also disable eager execution, and it should work, since the input layers are then ordinary graph tensors rather than KerasTensors:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

x = tf.keras.Input(shape=(224, 224, 3), batch_size=None)
x1 = tf.keras.Input(1, dtype=tf.int32)
print(type(x1), type(x))  # <class 'tensorflow.python.framework.ops.Tensor'> <class 'tensorflow.python.framework.ops.Tensor'>
y = tf.cond(tf.less(x1, 5), lambda: tf.keras.layers.ReLU()(x), lambda: tf.keras.layers.LeakyReLU(alpha=0.1)(x))
model = tf.keras.models.Model(inputs=[x, x1], outputs=[y])
model.summary()
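The underlying requirement is the same either way: tf.cond must see ordinary tensors, not KerasTensors, and its predicate must be a scalar boolean. A minimal standalone sketch with tf.function (hypothetical, not part of the original answer) that also collapses a batched comparison to a scalar via tf.reduce_all:

```python
import tensorflow as tf

@tf.function
def conditional_activation(x1, x):
  # tf.cond needs a scalar predicate; tf.reduce_all collapses the
  # batched (batch, 1) comparison to a single boolean.
  return tf.cond(tf.reduce_all(tf.less(x1, 5)),
                 lambda: tf.nn.relu(x),
                 lambda: tf.nn.leaky_relu(x, alpha=0.1))

x = tf.constant([[-2.0, 3.0]])
print(conditional_activation(tf.constant([[1]]), x).numpy())  # [[0. 3.]]
print(conditional_activation(tf.constant([[9]]), x).numpy())  # [[-0.2  3. ]]
```

Inside tf.function both branches are traced into the graph, and only the selected one executes at runtime.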

Upvotes: 3
