Reputation: 2772
I want to use fit_generator to handle my data.
I understand that the generator must run forever, and that samples_per_epoch defines the number of elements yielded by the generator before moving on to the next epoch.
But what is an epoch here? When we run fit, an epoch is one pass over the whole dataset, split according to batch_size. Here, though, batch_size has no meaning, since fit_generator essentially calls train_on_batch on each element the generator yields, and there is no such thing as "the whole dataset".
My question is: do samples_per_epoch and nb_epoch actually mean anything here? Or is the network trained the same way in any case, on samples_per_epoch * nb_epoch batches drawn from the generator, so that the epoch has no real meaning?
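For concreteness, this is the kind of setup I have in mind; the model, the dummy batches, and the numbers are just placeholders:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

def random_batches(batch_size=32):
    """Runs forever and never defines a 'whole dataset'."""
    while True:
        yield np.random.rand(batch_size, 20), np.random.randint(0, 2, (batch_size, 1))

model = Sequential([Dense(1, input_dim=20, activation='sigmoid')])
model.compile(optimizer='sgd', loss='binary_crossentropy')

# What do these two numbers really mean for a generator like the one above?
model.fit_generator(random_batches(32),
                    samples_per_epoch=1000,
                    nb_epoch=10)
```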
Upvotes: 3
Views: 1529
Reputation: 9099
fit_generator is blind to what data the generator produces. It is the responsibility of the generator to go over the whole dataset; using samples_per_epoch, fit_generator simply keeps track of the epoch count. Check here & here & here.
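As a rough sketch (Keras 1.x API, with made-up data, layer sizes, and names), a generator that takes responsibility for covering the whole dataset could look like this; setting samples_per_epoch to the dataset size then makes an epoch correspond to one full pass:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

X = np.random.rand(1000, 20)             # dummy dataset: 1000 samples, 20 features
y = np.random.randint(0, 2, (1000, 1))

def dataset_generator(X, y, batch_size=32):
    """Yields batches forever, wrapping around when it reaches the end of the data."""
    i = 0
    while True:
        stop = min(i + batch_size, len(X))
        yield X[i:stop], y[i:stop]
        i = 0 if stop == len(X) else stop  # wrap around: start the next pass

model = Sequential([Dense(1, input_dim=20, activation='sigmoid')])
model.compile(optimizer='sgd', loss='binary_crossentropy')

# fit_generator only counts samples: after samples_per_epoch samples it logs an
# epoch and starts the next one. With samples_per_epoch = len(X), one epoch is
# one full pass over the data.
model.fit_generator(dataset_generator(X, y, batch_size=32),
                    samples_per_epoch=len(X),
                    nb_epoch=10)
```

If samples_per_epoch does not match the dataset size, training still proceeds the same way; the epoch boundary just no longer lines up with a full pass, which only affects logging, callbacks, and how often validation runs.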
Upvotes: 3