Alexander Soare

Reputation: 3257

Tensorflow 2.0 stack() raises uninitialized tensors error

I am writing a custom layer where I need to loop over the batch dimension and then over the RGB channel dimension of an image. I'm still trying to understand how TensorFlow implements for-loops, and I'm not sure whether that's related to the error I present here.

Here is some pseudo-code:

    @tf.function()
    def _crop_and_resize(self, imgs, boxes, to_size):
        # prepare kernel_h and kernel_w

        n_images = tf.shape(imgs)[0]
        outputs = tf.TensorArray(dtype=tf.float32, size=n_images)
        for i in tf.range(n_images):
            # in the call to _bilinear we enter the inner loop
            output = self._bilinear(
                kernel_h[i],
                kernel_w[i],
                imgs[i])
            outputs.write(i, output)
        return outputs.stack()


    def _bilinear(self, kernel_h, kernel_w, img):
        n_channels = tf.shape(img)[2]
        result_channels = tf.TensorArray(dtype=tf.float32, size=n_channels)
        for i in tf.range(n_channels):
            result_channels.write(i,
                tf.matmul(
                    tf.matmul(kernel_h, tf.tile(img[:, :, i], [1, 1])),
                    kernel_w, transpose_b=True))
        return tf.transpose(result_channels.stack(), perm=[1,2,0])

I'm getting the following error:

    InvalidArgumentError: Tried to stack list which only contains uninitialized tensors
    and has a non-fully-defined element_shape: [?,?,?]
    [[{{node model_17/att_1/PartitionedCall/TensorArrayV2Stack/TensorListStack}}]]
    [Op:__inference_distributed_function_11150]
    Function call stack: distributed_function

I've seen many examples of using TensorArray and stack() in this manner with a single for-loop, but I'm not sure whether my nested for-loop is causing an issue.

Upvotes: 2

Views: 1154

Answers (1)

Gabe

Reputation: 1028

I had a similar issue and resolved it thanks to a comment on this GitHub issue: https://github.com/tensorflow/tensorflow/issues/30409#issuecomment-508962873

Basically, in eager mode the .write() call updates the TensorArray in place as a convenience, but in a graph setting .write() returns a new TensorArray, so you need to chain the .write() calls by keeping the returned value, e.g.

    outputs = outputs.write(i, output)

This solved it for me.
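
For reference, applying the same fix to the code in the question looks roughly like this (a sketch only: the kernel_h/kernel_w preparation is still elided, as in the original). Note that the inner loop in _bilinear needs the same reassignment, since its write() call has the same problem:

    @tf.function()
    def _crop_and_resize(self, imgs, boxes, to_size):
        # prepare kernel_h and kernel_w (elided, as in the question)

        n_images = tf.shape(imgs)[0]
        outputs = tf.TensorArray(dtype=tf.float32, size=n_images)
        for i in tf.range(n_images):
            output = self._bilinear(kernel_h[i], kernel_w[i], imgs[i])
            # write() returns a new TensorArray; keep the returned value
            outputs = outputs.write(i, output)
        return outputs.stack()

    def _bilinear(self, kernel_h, kernel_w, img):
        n_channels = tf.shape(img)[2]
        result_channels = tf.TensorArray(dtype=tf.float32, size=n_channels)
        for i in tf.range(n_channels):
            # same pattern: capture the TensorArray returned by write()
            result_channels = result_channels.write(i,
                tf.matmul(
                    tf.matmul(kernel_h, tf.tile(img[:, :, i], [1, 1])),
                    kernel_w, transpose_b=True))
        return tf.transpose(result_channels.stack(), perm=[1, 2, 0])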

Upvotes: 4
