Reputation: 1692
This question may have been asked but I failed to find it.
What is the simplest way to constantly get batches of data from a dataset? Is there a built-in tensorflow function to do so?
for instance:
for i in range(num_trains):
    x_batch, y_batch = get_batch(x_train, y_train, batch_size)
    sess.run(train_step, feed_dict={x: x_batch, y: y_batch})
If there is no such built-in function, how would you implement it? I tried to write one myself, but I could not figure out how to get a new batch, different from the previous ones, each time I call the function.
Thanks!
Upvotes: 1
Views: 2783
Reputation: 17191
You can try:
# Feed batch data
def get_batch(inputX, inputY, batch_size):
    duration = len(inputX)
    for i in range(0, duration // batch_size):
        idx = i * batch_size
        # Yield one consecutive slice of the inputs per call
        yield inputX[idx:idx + batch_size], inputY[idx:idx + batch_size]
You can also use TensorFlow's Dataset API:
dataset = tf.data.Dataset.from_tensor_slices((train_x, train_y))
dataset = dataset.batch(batch_size)
Getting the batch:
X = np.arange(100)
Y = X
batch = get_batch(X, Y, 5)
batch_x, batch_y = next(batch)
print(batch_x, batch_y)
#[0 1 2 3 4] [0 1 2 3 4]
batch_x, batch_y = next(batch)
print(batch_x, batch_y)
#[5 6 7 8 9] [5 6 7 8 9]
Typically, to run over the dataset for multiple epochs, you would do:
for epoch in range(num_epochs):
    for x_batch, y_batch in get_batch(x_train, y_train, batch_size):
        sess.run(train_step, feed_dict={x: x_batch, y: y_batch})
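Since the generator above always walks through the data in the same order, one common addition is to shuffle the sample indices at the start of each epoch so the batches differ from epoch to epoch. A minimal sketch of that idea (the shuffled_batches helper and num_epochs name are just for illustration, and x_train / y_train are assumed to be NumPy arrays):
import numpy as np

def shuffled_batches(inputX, inputY, batch_size):
    # Permute the sample order, then yield consecutive slices of the permutation
    perm = np.random.permutation(len(inputX))
    for i in range(len(inputX) // batch_size):
        idx = perm[i * batch_size:(i + 1) * batch_size]
        yield inputX[idx], inputY[idx]

for epoch in range(num_epochs):
    for x_batch, y_batch in shuffled_batches(x_train, y_train, batch_size):
        sess.run(train_step, feed_dict={x: x_batch, y: y_batch})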
Using the Dataset API:
dataset = tf.data.Dataset.from_tensor_slices((X, Y))
dataset = dataset.batch(5)
iterator = dataset.make_initializable_iterator()
train_x, train_y = iterator.get_next()

with tf.Session() as sess:
    sess.run(iterator.initializer)
    for i in range(2):
        print(sess.run([train_x, train_y]))
# [array([0, 1, 2, 3, 4]), array([0, 1, 2, 3, 4])]
# [array([5, 6, 7, 8, 9]), array([5, 6, 7, 8, 9])]
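For multiple epochs with the Dataset API, one option is to add shuffle and repeat to the pipeline and loop until the iterator runs out. A rough sketch, assuming a TF 1.x graph-mode setup; num_epochs, buffer_size, and train_op are placeholder names, not part of the original answer:
dataset = tf.data.Dataset.from_tensor_slices((train_x, train_y))
dataset = dataset.shuffle(buffer_size=1000)  # reshuffle the order on every pass
dataset = dataset.repeat(num_epochs)         # cycle through the data num_epochs times
dataset = dataset.batch(batch_size)
iterator = dataset.make_one_shot_iterator()
batch_x, batch_y = iterator.get_next()

# train_op stands for whatever training step you build on top of batch_x, batch_y
with tf.Session() as sess:
    while True:
        try:
            sess.run(train_op)
        except tf.errors.OutOfRangeError:  # raised once all epochs are consumed
            break
With this pattern you don't feed placeholders at all; the batches flow straight from the input pipeline into the graph.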
Upvotes: 2