Lorenzo Gagliardi

Reputation: 1

How do you feed Ragged Tensors to a DNN trained by TensorFlow Extended?

We are developing an ML pipeline with TFX, using the most common components such as ExampleGen, Transform, Trainer, and so on. The examples that have to be fed to the DNN have varying length, so we decided to use Ragged Tensors to allow inputs of non-fixed dimension and avoid padding. However, it seems that the Trainer component automatically tries to convert the input to a Tensor, or something like that.

TypeError: Failed to convert object of type 'tensorflow.python.ops.ragged.ragged_tensor.RaggedTensor'> to Tensor. Contents: tf.RaggedTensor(values=Tensor("Placeholder:0", shape=(None, 605), dtype=float32), row_splits=Tensor("Placeholder_1:0", shape=(None,), dtype=int64)). Consider casting elements to a supported type.
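For reference, here is a toy reconstruction (made-up values; only the 605-wide inner dimension matches our data) of the ragged structure the error mentions, i.e. a values tensor of shape (total_rows, 605) partitioned into per-example groups by row_splits:

import tensorflow as tf

# Toy example only: 3 feature rows in total, split into 2 examples
# (the first example gets 2 rows, the second gets 1).
values = tf.zeros([3, 605], dtype=tf.float32)
row_splits = tf.constant([0, 2, 3], dtype=tf.int64)
rt = tf.RaggedTensor.from_row_splits(values, row_splits)
print(rt.shape)  # (2, None, 605) -- the middle dimension varies per example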

The model is a Keras Sequential DNN for structured data, with mostly Dense layers. Is it possible to use Ragged Tensors in a pipeline written with TFX? Does Keras support Ragged Tensors?

Thank you all!

Btw we are using:

Upvotes: 0

Views: 543

Answers (1)

Pritam Dodeja

Reputation: 326

A Ragged Tensor is essentially a list with a variable number of elements. The issue with feeding this into a DNN is that you don't know how many inputs you will get, so you cannot build a graph ahead of time that performs that computation. Ragged Tensors not going into Dense layers is a restriction imposed by linear algebra, not by TensorFlow. If you share more details to make your problem reproducible, I can try to help.
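In the meantime, here is a rough sketch of one common way around it, assuming TF 2.x Keras and the (None, 605) shape from your error message: accept the ragged input in a functional model and pool the variable-length axis down to a fixed-size tensor before the Dense stack (layer names and sizes below are placeholders, not your actual model):

import tensorflow as tf

# Sketch only: a functional Keras model that takes a ragged batch of shape
# (batch, None, 605) and collapses the ragged axis before the Dense layers,
# since Dense needs a fixed-size input.
inputs = tf.keras.Input(shape=(None, 605), ragged=True, name="features")

# Mean-pool over the variable-length axis; the result is a dense (batch, 605) tensor.
pooled = tf.keras.layers.Lambda(lambda rt: tf.reduce_mean(rt, axis=1))(inputs)

hidden = tf.keras.layers.Dense(64, activation="relu")(pooled)
outputs = tf.keras.layers.Dense(1)(hidden)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()

If padding turns out to be acceptable after all, the other option is to call .to_tensor() on the RaggedTensor (optionally with a default_value) to get a padded dense tensor of fixed width before the Dense layers.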

Upvotes: 0
