George

Reputation: 5681

memory error in predict_on_batch() on large data set

I have a test set of 18000 examples.

X_test.shape: (18000, 128, 128, 1)

I have trained my model and want to use predict on X_test.

If I try to just use:

pred = model.predict_on_batch(X_test)

it gives a memory error.

I tried something like:

X_test_split = X_test.flatten()
X_test_split = np.array_split(X_test_split, 562) # batch size is 32
pred = np.empty(len(X_test_split), dtype=np.float32)

for idx, _ in enumerate(X_test_split):
    pred[idx] = model.predict_on_batch(X_test_split[idx].reshape(32, 128, 128, 1))

but it either gives me a memory error again or an error about the reshape (depending on which variation of the above code I try).

I have the same problem when using predict_generator as well.

Upvotes: 0

Views: 856

Answers (1)

today

Reputation: 33430

As requested by the OP, I am posting my comment as an answer and will try to elaborate a bit more:

It seems your model is large, and therefore you need to either use a smaller batch size (< 32, since you mentioned it does not work with 32) or modify the model to decrease the number of parameters (e.g. by removing some layers, decreasing the number of filters or units, etc.).
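
For reference, here is a minimal sketch of both options for batching the prediction. It assumes the model and X_test from the question; the batch size of 16 is just an arbitrary value below the 32 that failed:

import numpy as np

# Option 1: let predict() do the batching internally; only
# batch_size samples are pushed through the model at a time.
pred = model.predict(X_test, batch_size=16)

# Option 2: split along the sample axis and loop manually.
# np.array_split keeps the (128, 128, 1) shape of each example,
# so no flatten/reshape is needed. 18000 / 16 = 1125 equal chunks.
chunks = np.array_split(X_test, len(X_test) // 16)
pred = np.concatenate([model.predict_on_batch(c) for c in chunks])

Splitting along the first axis also avoids the reshape error in the question: after flatten(), the array does not divide into 562 equal chunks of exactly 32 * 128 * 128 elements (18000 is not divisible by 32), so reshape(32, 128, 128, 1) cannot always succeed.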

Upvotes: 1
