hrzz

Reputation: 45

How to fix "Invalid argument: Key: feature. Can't parse serialized Example."

I am trying to use TFRecordDataset to load my data, but I keep hitting this error. I have searched a lot and still cannot fix it.

My TensorFlow version is 1.14.0.

Write to tfrecords:

import pandas as pd
import tensorflow as tf

raw_data = pd.read_csv(data_file, header=None, delim_whitespace=True)
data_x = raw_data.iloc[:, 1:-4].values
data_y = raw_data.iloc[:, -3:].values
writer = tf.io.TFRecordWriter('test.tfrecord')
for i in range(100):
    example = tf.train.Example(features=tf.train.Features(
        feature={
            'feature': tf.train.Feature(float_list=tf.train.FloatList(value=data_x[i])),
            'target': tf.train.Feature(float_list=tf.train.FloatList(value=data_y[i]))
        }))
    writer.write(example.SerializeToString())
writer.close()

where shape of data_x is (n, 736) and that of data_y is (n, 3).
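(Editor's note: one way to sanity-check what was actually written is to deserialize an Example and count the floats stored under a key. A minimal self-contained sketch, using a toy 5-float payload rather than the real file; running the same FromString call over records read back from test.tfrecord would reveal each record's true per-key length.)

```python
import tensorflow as tf

# Round-trip a toy Example (5 floats here, purely illustrative) and count
# the values actually stored under the 'feature' key.
toy = tf.train.Example(features=tf.train.Features(feature={
    'feature': tf.train.Feature(float_list=tf.train.FloatList(value=[0.5] * 5)),
}))
parsed = tf.train.Example.FromString(toy.SerializeToString())
print(len(parsed.features.feature['feature'].float_list.value))  # 5
```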

Parse tfrecords:

def parse_function(record):
    features = {
        'feature': tf.FixedLenFeature([736], dtype=tf.float32),
        'target': tf.FixedLenFeature([3], dtype=tf.float32)
    }
    example = tf.io.parse_single_example(record, features)
    return example['feature'], example['target']

Then read the data from tfrecords:

dataset = tf.data.TFRecordDataset('test.tfrecord')
dataset = dataset.shuffle(BUFFER_SIZE)
dataset = dataset.map(parse_function)
dataset = dataset.batch(BATCH_SIZE)
dataset = dataset.prefetch(BUFFER_SIZE)

The test_dataset is created the same way. Then build and compile the model:

model = keras.Sequential([
        keras.layers.Dense(400, activation=tf.nn.tanh),
        keras.layers.Dense(400, activation=tf.nn.tanh),
        keras.layers.Dense(400, activation=tf.nn.tanh),
        keras.layers.Dense(3)
    ])
#     print(model.summary())
optim = tf.train.AdamOptimizer(learning_rate=LEARNING_RATE)
model.compile(optimizer=optim,
                  loss=rmse_and_norm_mae,
                  metrics=[rmse_and_norm_mae])

Finally, train the model and trigger the error:

cp_callback = keras.callbacks.ModelCheckpoint(
    save_weights_path, verbose=0, save_weights_only=True, save_freq=SAVE_PERIOD)
model.fit(dataset, epochs=EPOCHS, steps_per_epoch=10,
          validation_data=test_dataset, validation_steps=10,
          callbacks=[cp_callback], verbose=2)

Error:

InvalidArgumentError: 2 root error(s) found.
  (0) Invalid argument: Key: feature.  Can't parse serialized Example.
     [[{{node ParseSingleExample/ParseSingleExample}}]]
     [[IteratorGetNext_64]]
     [[training_32/gradients/loss_16/dense_139_loss/Sum_grad/Shape/_2055]]
  (1) Invalid argument: Key: feature.  Can't parse serialized Example.
     [[{{node ParseSingleExample/ParseSingleExample}}]]
     [[IteratorGetNext_64]]
0 successful operations.
0 derived errors ignored.

How can I make it work? Thank you very much in advance for your help!

Upvotes: 3

Views: 4458

Answers (2)

WingedRasengan927

Reputation: 150

In TF 2, you can use a feature with a variable shape, as shown below:

'feature': tf.VarLenFeature(dtype=tf.float32)
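(Editor's note: a minimal sketch of this approach, assuming TF 2.x eager mode; the three-float payload is illustrative. Note that VarLenFeature yields a SparseTensor, so it typically needs a tf.sparse.to_dense step before being fed to a Dense layer.)

```python
import tensorflow as tf

# Build a toy serialized Example with 3 floats, then parse it back with
# VarLenFeature, which tolerates any per-record length.
example = tf.train.Example(features=tf.train.Features(feature={
    'feature': tf.train.Feature(float_list=tf.train.FloatList(value=[1.0, 2.0, 3.0])),
}))
parsed = tf.io.parse_single_example(
    example.SerializeToString(),
    {'feature': tf.io.VarLenFeature(dtype=tf.float32)})
dense = tf.sparse.to_dense(parsed['feature'])  # SparseTensor -> dense, shape (3,)
print(dense.numpy())
```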

Upvotes: 0

Prasad

Reputation: 6034

The problem is in how you create the data. Your data_x has shape (100, 734), not (100, 736). You exclude both the first column and the 736th column when you run this line:

data_x = raw_data.iloc[:, 1:-4].values

If dropping those two columns is intended, you will have to specify a size of 734 in tf.FixedLenFeature, as shown:

'feature': tf.FixedLenFeature([734], dtype=tf.float32),
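(Editor's note: the slice arithmetic can be checked without TensorFlow at all. In this plain-Python sketch, the 739-column total width is an assumption chosen so the numbers line up with the shapes discussed above: 734 features + 3 targets + 2 dropped columns.)

```python
# Stand-in for one CSV row with 739 columns, indices 0..738 (assumed width).
row = list(range(739))
features = row[1:-4]   # same bounds as raw_data.iloc[:, 1:-4] -> columns 1..734
targets = row[-3:]     # same bounds as raw_data.iloc[:, -3:]  -> columns 736..738

print(len(features))   # 734
print(len(targets))    # 3
# Column 0 and column index 735 (the 736th column) land in neither slice.
```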

Upvotes: 4
