Wanna-be Coder

Reputation: 169

Tensorflow - Generating a Tensorboard embedding with batching?

Pretty straightforward question:

There's a great example from the 2017 TensorFlow Developer Summit here showing how to use TensorBoard and generate an embedding with the MNIST dataset, which made it pretty easy to figure out how to implement one with my own dataset.

However, with their example, they generate their embedding using 1024 images in one run without any sort of batching. Is there any way to generate it via batching? There doesn't seem to be an obvious way to do so.

For example, if I want an embedding of 1000 images but can't compute all 1000 at once, I want to "store" them in batches of, say, 50, so that the end result is still an embedding of all 1000 images.

Thanks!

Upvotes: 0

Views: 130

Answers (1)

Alexandre Passos

Reputation: 5206

The code computes the embedding for all 1024 images when it runs sess.run(assignment, feed_dict={x: mnist.test.images[:1024], y: mnist.test.labels[:1024]}). If you want to compute the embedding for just a subset of images at a time, feed that subset and store the embeddings for that subset only (so, use several variables instead of a single one), while keeping the rest of the code the same.

Upvotes: 1
