mr.m

Reputation: 332

OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed: AutoGraph did convert this function for a customized model

I'm trying to adapt the "Attention Is All You Need" paper to time series with a few tweaks, but I'm getting this error:

OperatorNotAllowedInGraphError: iterating over tf.Tensor is not allowed: AutoGraph did convert this function.

Code:

import tensorflow as tf
from tensorflow import keras

class Attention(tf.keras.layers.Layer):

    def __init__(self, dk, dv, num_heads, filter_size):
        super().__init__()
        self.dk = dk
        self.dv = dv
        self.num_heads = num_heads
        
        self.conv_q = tf.keras.layers.Conv1D(dk * num_heads, filter_size, padding='causal')
        self.conv_k = tf.keras.layers.Conv1D(dk * num_heads, filter_size, padding='causal')
        self.dense_v = tf.keras.layers.Dense(dv * num_heads)
        self.dense1 = tf.keras.layers.Dense(dv, activation='relu')
        self.dense2 = tf.keras.layers.Dense(dv)
        
    def split_heads(self, x, batch_size, dim):
        x = tf.reshape(x, (batch_size, -1, self.num_heads, dim))
        return tf.transpose(x, perm=[0, 2, 1, 3])
    
    def call(self, inputs):
        batch_size, time_steps, _ = tf.shape(inputs)
        
        q = self.conv_q(inputs)
        k = self.conv_k(inputs)
        v = self.dense_v(inputs)
        
        q = self.split_heads(q, batch_size, self.dk)
        k = self.split_heads(k, batch_size, self.dk)
        v = self.split_heads(v, batch_size, self.dv)
        
        mask = 1 - tf.linalg.band_part(tf.ones((batch_size, self.num_heads, time_steps, time_steps)), -1, 0)
        
        dk = tf.cast(self.dk, tf.float32)
        
        score = tf.nn.softmax(tf.matmul(q, k, transpose_b=True)/tf.math.sqrt(dk) + mask * -1e9)
        
        outputs = tf.matmul(score, v)
        
        outputs = tf.transpose(outputs, perm=[0, 2, 1, 3])
        outputs = tf.reshape(outputs, (batch_size, time_steps, -1))
        
        outputs = self.dense1(outputs)
        outputs = self.dense2(outputs)
        
        return outputs

class Transformer(tf.keras.models.Model):
    """
    Time Series Transformer Model
    """
    def __init__(self, dk, dv, num_heads, filter_size):
        super().__init__()

        self.attention = Attention(dk, dv, num_heads, filter_size)
        self.dense_sigma = tf.keras.layers.Dense(1)
    
    def call(self, inputs):
        outputs = self.attention(inputs)
        sigma = self.dense_sigma(outputs)
        
        return sigma

Mymodel = Transformer(3, 3, 4, 3)

Mymodel.compile(loss="mean_squared_error",
    optimizer=keras.optimizers.Adam(learning_rate=1e-4))

Mymodel.fit(X_train, Y_train, epochs=10, batch_size=32)
# X_train & Y_train are numpy arrays with shape (batch_size, timesteps, no. of features)
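To reproduce the error without the real dataset, random arrays of the stated shape can stand in for X_train and Y_train (the sizes below are arbitrary assumptions, not from the original data):

import numpy as np

# Hypothetical stand-in data: 64 samples, 20 time steps, 3 features
X_train = np.random.rand(64, 20, 3).astype("float32")
# Targets match the model output shape (batch_size, timesteps, 1)
Y_train = np.random.rand(64, 20, 1).astype("float32")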

Full Error :

/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
    984           except Exception as e:  # pylint:disable=broad-except
    985             if hasattr(e, "ag_error_metadata"):
--> 986               raise e.ag_error_metadata.to_exception(e)
    987             else:
    988               raise

OperatorNotAllowedInGraphError: in user code:

    /usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/training.py:855 train_function  *
        return step_function(self, iterator)
    <ipython-input-20-56419dd4aeb6>:61 call  *
        outputs = self.attention(inputs)
    <ipython-input-3-a936077354d3>:24 call  *
        batch_size, time_steps, _ = tf.shape(inputs)
    /usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py:520 __iter__
        self._disallow_iteration()
    /usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py:513 _disallow_iteration
        self._disallow_when_autograph_enabled("iterating over `tf.Tensor`")
    /usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py:491 _disallow_when_autograph_enabled
        " indicate you are trying to use an unsupported feature.".format(task))

    OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed: AutoGraph did convert this function. This might indicate you are trying to use an unsupported feature.

Upvotes: 3

Views: 6165

Answers (1)

Kaveh

Reputation: 4960

It seems that in graph mode, unpacking a tensor requires iterating over it, which is not allowed. Instead, you may index the shape tensor directly:

batch_size = tf.shape(inputs)[0]
time_steps = tf.shape(inputs)[1]
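For example, the beginning of the Attention layer's call method from the question could look like this (a minimal sketch; the rest of the method stays unchanged):

    def call(self, inputs):
        # Index the dynamic shape tensor instead of unpacking it
        batch_size = tf.shape(inputs)[0]
        time_steps = tf.shape(inputs)[1]

        q = self.conv_q(inputs)
        k = self.conv_k(inputs)
        v = self.dense_v(inputs)
        # ... rest of the method unchanged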

My first recommendation was to use .shape; however, I modified my answer because of this hint from the TensorFlow docs here:

tf.shape(x) and x.shape should be identical in eager mode. Within tf.function, not all dimensions may be known until execution time. Hence when defining custom layers and models for graph mode, prefer the dynamic tf.shape(x) over the static x.shape.
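To illustrate that hint (a minimal sketch, not part of the original answer): inside a tf.function, x.shape may contain None for dimensions that are unknown at trace time, while tf.shape(x) returns a tensor with the actual runtime sizes.

import tensorflow as tf

# Dimensions 0 and 1 are left unknown until execution time
@tf.function(input_signature=[tf.TensorSpec(shape=[None, None, 3], dtype=tf.float32)])
def describe(x):
    static_shape = x.shape        # TensorShape([None, None, 3]) at trace time
    dynamic_shape = tf.shape(x)   # tensor holding the actual runtime sizes
    tf.print("static:", static_shape, "dynamic:", dynamic_shape)
    return dynamic_shape

describe(tf.zeros((32, 20, 3)))   # dynamic shape resolves to [32 20 3]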

Upvotes: 3
