Reputation: 2697
I am looking at the Keras text generation example that uses an RNN/LSTM, but I am still confused about the difference between the terms epoch and iteration.
There is a previous question asking the same thing, but I cannot understand the answer, or the answer differs from how I understand it and also from how the following example handles it. Based on that answer, it says:
one epoch = one forward pass and one backward pass of all the training examples
number of iterations = number of passes, each pass using [batch size] number of examples.
Example: if you have 1000 training examples, and your batch size is 500, then it will take 2 iterations to complete 1 epoch.
From this I conclude: (#training examples / batch size) = (#iterations / #epochs).
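To make that concrete with the numbers from the example (my own quick sketch, not from the linked answer):

training_examples = 1000
batch_size = 500
iterations_per_epoch = training_examples // batch_size  # 1000 / 500 = 2 (in general, ceil() when it does not divide evenly)
print(iterations_per_epoch)  # 2 iterations to complete 1 epoch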
However, the following example, as I understand it, does not match that conclusion.
# train the model, output generated text after each iteration
for iteration in range(1, 60):
    print()
    print('-' * 50)
    print('Iteration', iteration)
    model.fit(X, y, batch_size=128, nb_epoch=1)

    start_index = random.randint(0, len(text) - maxlen - 1)

    for diversity in [0.2, 0.5, 1.0, 1.2]:
        print()
        print('----- diversity:', diversity)

        generated = ''
        sentence = text[start_index: start_index + maxlen]
        generated += sentence
        print('----- Generating with seed: "' + sentence + '"')
        sys.stdout.write(generated)

        for i in range(400):
            x = np.zeros((1, maxlen, len(chars)))
            for t, char in enumerate(sentence):
                x[0, t, char_indices[char]] = 1.

            preds = model.predict(x, verbose=0)[0]
            next_index = sample(preds, diversity)
            next_char = indices_char[next_index]

            generated += next_char
            sentence = sentence[1:] + next_char

            sys.stdout.write(next_char)
            sys.stdout.flush()
        print()
Here, the iteration count is 60 and the number of epochs is set to 1, which confuses me a lot. It seems there are 60 iterations, as stated by for iteration in range(1, 60), and for each iteration one epoch is run, as stated by model.fit(X, y, batch_size=128, nb_epoch=1) inside the for loop. And on top of that, there is a batch_size=128. So what does iteration mean here, exactly?
Can anyone explain the difference between iteration and epoch based on this example?
Upvotes: 3
Views: 5719
Reputation: 447
In this case the iteration is used just for displaying intermediate results. We can delete this code:
for diversity in [0.2, 0.5, 1.0, 1.2]:
    print()
    print('----- diversity:', diversity)

    generated = ''
    sentence = text[start_index: start_index + maxlen]
    generated += sentence
    print('----- Generating with seed: "' + sentence + '"')
    sys.stdout.write(generated)

    for i in range(400):
        x = np.zeros((1, maxlen, len(chars)))
        for t, char in enumerate(sentence):
            x[0, t, char_indices[char]] = 1.

        preds = model.predict(x, verbose=0)[0]
        next_index = sample(preds, diversity)
        next_char = indices_char[next_index]

        generated += next_char
        sentence = sentence[1:] + next_char

        sys.stdout.write(next_char)
        sys.stdout.flush()
    print()
and instead of:
for iteration in range(1, 60):
    model.fit(X, y, batch_size=128, nb_epoch=1)
write:
model.fit(X, y, batch_size=128, nb_epoch=60)
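If you still want the generated sample text after every epoch, a callback can take over that job so fit() is still called only once. This is only a sketch from me, not part of the original answer, assuming a Keras version that provides keras.callbacks.LambdaCallback; on_epoch_end_sample is a hypothetical stand-in for the deleted generation code:

from keras.callbacks import LambdaCallback

def on_epoch_end_sample(epoch, logs):
    # hypothetical helper: put the diversity/sampling code removed above in here
    print('Finished epoch', epoch + 1)

sample_callback = LambdaCallback(on_epoch_end=on_epoch_end_sample)
model.fit(X, y, batch_size=128, nb_epoch=60, callbacks=[sample_callback])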
Upvotes: 1
Reputation: 40506
I think that in this example the iteration means something different: you are iterating through the learning process, and after every epoch you are doing something with the partially learnt model. You are doing it iteratively, and that's why the word iteration is used.
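To spell that out with a minimal sketch of my own (not part of this answer): each outer "iteration" trains for exactly one epoch, and each epoch itself consists of roughly len(X) / 128 mini-batch updates inside fit():

for iteration in range(1, 60):
    # one epoch = one full pass over X, done as ceil(len(X) / 128) mini-batch updates
    model.fit(X, y, batch_size=128, nb_epoch=1)
    # ...then inspect the partially learnt model, e.g. generate some sample text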
Upvotes: 3