skidjoe

Reputation: 649

Converting a list of dictionaries to a tf dataset

I have a list of dictionaries which has been completely preprocessed and is ready to feed into a BERT model. However, I am struggling a lot to get it into a tf.data.Dataset. This is what one element of my dataset looks like: print(dataset[0])

{'input_ids': <tf.Tensor: shape=(128,), dtype=int64, numpy= array([  101,   171,   112,  2537, 12293,   131, 11250,   118,   118,
        2537, 12293,   131, 11250,  1110,  1126,  1237,  1778,  1326,
        1687,  1111,  5957,  1398, 11737,  1118,  8129, 14399,  1105,
        3230,  9426, 27277,   119,  1135,  1110,  1103,  1148,  1326,
        1872,  4418,  1111,  1115,  1555,   117,  1105,  1103,  1148,
        2537, 12293,  1326,  1290,  2537, 12293,   131,  9892,  4803,
        1107,  1478,   119,  9617,  4986,   170,  4967,  1196,  1103,
        1958,  1104,  1103,  1560,  2537, 12293,  1326,  1105,  2767,
        1121,  1103, 21169,  1104,  1103, 18061,  1666,  2672,  2441,
         117, 11250, 16001,  1103,  4245,   118,   118,   148,  1979,
        1320,  1594,  1229,  1378,  1103,  3039,  1104,  1103,  6684,
       11250,   119, 23886,   147,   119, 16218,  1105,  6619, 11679,
       19644,  2145,  2867,  1112,  1437, 14627,   102,   171,   112,
        1110,  1175,   170,  1207,  2851,   189, 14909,  1326,  1909,
         112,   102])>, 'input_mask': <tf.Tensor: shape=(128,), dtype=int64, numpy= array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])>, 'segment_ids': <tf.Tensor: shape=(128,), dtype=int64, numpy= array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])>, 'labels': <tf.Tensor: shape=(), dtype=int64, numpy=1>}

All I need to do is get it into the tf.data.Dataset format; however, I cannot seem to figure out how to make any of the available functions (from_tensor_slices, from_tensors, from_generator) work with what I have.

Upvotes: 4

Views: 5861

Answers (2)

JumbaMumba

Reputation: 579

You can do this by using pandas (or you can just mimic the output of the to_dict method):

dataset = tf.data.Dataset.from_tensor_slices(pd.DataFrame.from_dict(records).to_dict(orient="list"))

where records is a list of dictionaries.
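For example, a minimal sketch of that idea (records is assumed to be a list of dictionaries shaped like the one in the question):

import pandas as pd
import tensorflow as tf

# to_dict(orient="list") produces a dict of columns, e.g.
# {"input_ids": [ids_0, ids_1, ...], "input_mask": [...], ...},
# which from_tensor_slices slices along the first axis, yielding
# one dictionary per example.
columns = pd.DataFrame.from_dict(records).to_dict(orient="list")
dataset = tf.data.Dataset.from_tensor_slices(columns)

# Equivalently, without pandas, build the dict of lists directly:
columns = {key: [r[key] for r in records] for key in records[0]}
dataset = tf.data.Dataset.from_tensor_slices(columns)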

Upvotes: 6

user271687

Reputation: 1

I am also struggling with this, and it is a little frustrating to see that the tensorflow_datasets module returns datapoints as dictionaries (e.g. https://www.tensorflow.org/tutorials/images/segmentation) and yet no official function seems to exist to create a dataset from a list of dictionaries.

I have come up with this workaround:

import tensorflow as tf

input_ids = tf.data.Dataset.from_tensor_slices([d['input_ids'] for d in dataset])
input_masks = tf.data.Dataset.from_tensor_slices([d['input_mask'] for d in dataset])
segment_ids = tf.data.Dataset.from_tensor_slices([d['segment_ids'] for d in dataset])
labels = tf.data.Dataset.from_tensor_slices([d['labels'] for d in dataset])

ds = tf.data.Dataset.zip((input_ids, input_masks, segment_ids, labels))
ds = ds.map(lambda x, y, z, l: {"input_ids": x, "input_mask": y,
                                "segment_ids": z, "labels": l})

However, it is not really scalable.
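A more generic sketch of the same zip-and-map idea, looping over whichever keys the dictionaries share (assuming every dictionary has the same keys):

import tensorflow as tf

# dataset is assumed to be the list of dictionaries from the question.
keys = list(dataset[0].keys())

# Build one component dataset per key, in a fixed key order.
components = tuple(
    tf.data.Dataset.from_tensor_slices([d[k] for d in dataset]) for k in keys
)

# Zip the components together and rebuild the dictionary per element.
ds = tf.data.Dataset.zip(components)
ds = ds.map(lambda *values: dict(zip(keys, values)))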

Upvotes: 0
