Bryon

Reputation: 113

How do I build an input_fn for Estimator using images stored in a TFRecords file

Is there an example of how to construct the input_fn needed by tf.contrib.learn.Estimator for an image classification model? My images are stored in multiple TFRecords files.

Using tf.contrib.learn.read_batch_record_features, I am able to generate batches of encoded image strings. However, I don't see an easy way to convert these strings into images.

Upvotes: 4

Views: 2989

Answers (1)

Hamed MP

Reputation: 5503

Referring to the example here, you can use something like the following for the MNIST and Fashion-MNIST datasets stored in train.tfrecords and test.tfrecords.

The conversion to TFRecords is done by the code here; you then need a parser to recover the original image and label from each serialized example.

def parser(serialized_example):
  """Parses a single tf.Example into image and label tensors."""
  features = tf.parse_single_example(
      serialized_example,
      features={
          'image_raw': tf.FixedLenFeature([], tf.string),
          'label': tf.FixedLenFeature([], tf.int64),
      })
  image = tf.decode_raw(features['image_raw'], tf.uint8)
  image.set_shape([28 * 28])

  # Normalize the values of the image from the range [0, 255] to [-0.5, 0.5]
  image = tf.cast(image, tf.float32) / 255 - 0.5
  label = tf.cast(features['label'], tf.int32)
  return image, label
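The parser above assumes raw pixel bytes (`tf.decode_raw`). If, as in the question, your TFRecords hold JPEG-encoded image strings instead, a parser along these lines should work; note that the feature keys `'image/encoded'` and `'image/class/label'` are assumptions about your file layout, and `tf` is assumed imported as in the rest of the answer:

```python
def jpeg_parser(serialized_example):
  """Parses a tf.Example holding a JPEG-encoded image (sketch; the
  feature keys below are assumed and may differ in your files)."""
  features = tf.parse_single_example(
      serialized_example,
      features={
          'image/encoded': tf.FixedLenFeature([], tf.string),
          'image/class/label': tf.FixedLenFeature([], tf.int64),
      })
  # decode_jpeg turns the encoded string into a uint8 HxWxC tensor
  image = tf.image.decode_jpeg(features['image/encoded'], channels=3)
  image = tf.cast(image, tf.float32) / 255 - 0.5
  label = tf.cast(features['image/class/label'], tf.int32)
  return image, label
```

`tf.image.decode_png` (or `tf.image.decode_image`) is the analogous call for PNG-encoded strings.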

Once you have the parser, the rest is simple: call TFRecordDataset(train_filenames) and map the parser function over each element, so you get an image and label as output.

# Keep a list of filenames, so you can easily feed in a directory of TFRecords
training_filenames = ["data/train.tfrecords"]
test_filenames = ["data/test.tfrecords"]

# Define the input function for training
def train_input_fn():
  batch_size = 128  # example value; tune as needed

  # Import MNIST data
  dataset = tf.contrib.data.TFRecordDataset(training_filenames)

  # Map the parser over dataset, and batch results by up to batch_size
  dataset = dataset.map(parser, num_threads=1, output_buffer_size=batch_size)
  dataset = dataset.batch(batch_size)
  dataset = dataset.repeat()
  iterator = dataset.make_one_shot_iterator()

  features, labels = iterator.get_next()

  return features, labels

Upvotes: 4
