Reputation: 586
I have a huge NumPy array of shape (1919090, 140, 37). Something that large will not fit in memory, locally or on a server. So I was thinking of splitting the array into smaller parts, say of shape (19000, 140, 37), and training a Keras model on one part at a time: train on a part, save the model, load it again, and continue training on the next part, repeating until the model has been trained on all 100 or so pieces. Is there a way of doing this?
Upvotes: 0
Views: 109
Reputation: 56357
Yes, you can, but the concept is not called "parts" but batches, and it is the most common method of training neural networks. You just need to write a generator function that loads batches of your data one at a time and pass it to model.fit_generator to start training.
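
A minimal sketch of what that could look like, assuming the array has been saved to disk (the file names `features.npy`/`labels.npy`, the batch size, and the model itself are all placeholders). Memory-mapping the file means only the slice belonging to the current batch is ever read into RAM:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Hypothetical file names; mmap_mode keeps the full (1919090, 140, 37)
# array on disk and reads only the slices we index.
X = np.load("features.npy", mmap_mode="r")
y = np.load("labels.npy", mmap_mode="r")

BATCH_SIZE = 128

def batch_generator(X, y, batch_size):
    """Yield (inputs, targets) batches indefinitely, as fit_generator expects."""
    n = X.shape[0]
    while True:
        for start in range(0, n, batch_size):
            stop = start + batch_size
            # Slicing a memmap loads only this batch into memory.
            yield np.asarray(X[start:stop]), np.asarray(y[start:stop])

# Placeholder model; substitute your own architecture.
model = Sequential([
    LSTM(64, input_shape=(140, 37)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

steps = int(np.ceil(X.shape[0] / BATCH_SIZE))
model.fit_generator(batch_generator(X, y, BATCH_SIZE),
                    steps_per_epoch=steps,
                    epochs=5)
```

Note that in recent Keras versions `fit_generator` is deprecated and `model.fit` accepts generators directly with the same arguments.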
Upvotes: 1