Kermit

Reputation: 5992

Does `model.predict(x,batch_size=n)` use multiple cores?

When running `model.predict(features, batch_size=n)` with `batch_size > 1`, are those predictions made in parallel, i.e. across multiple cores at once?

https://github.com/tensorflow/tensorflow/blob/eb7bf384017fd4eabdf500687bec552971cbd41c/tensorflow/python/keras/engine/training_arrays_v1.py

I need to permute millions of columns, so I was hoping to run many predictions simultaneously using multiprocessing. However, there are major challenges with this: Keras + Tensorflow and Multiprocessing in Python

One of my friends suggested stacking the data and using `batch_size` as a workaround to increase performance, but since it would require a big rewrite, I am wondering what the potential gains would be.
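A minimal sketch of the stacking idea (pure NumPy, with a hypothetical `score` function standing in for the trained model): the permuted copies of the data are stacked into one array so that a single batched call replaces many per-sample calls.

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.random((8, 4))  # hypothetical data: 8 samples, 4 columns

def score(batch):
    # Hypothetical stand-in for model.predict(batch): any function
    # mapping (n, 4) -> (n, 1) works for this sketch.
    return batch.sum(axis=1, keepdims=True)

# Build one permuted copy of the data per column: column j is shuffled
# while the other columns stay fixed (the permutation pattern).
stacked = np.concatenate(
    [np.column_stack([rng.permutation(features[:, k]) if k == j else features[:, k]
                      for k in range(features.shape[1])])
     for j in range(features.shape[1])]
)

# One vectorized call over all 4 * 8 = 32 rows instead of 32 single calls.
preds = score(stacked)
print(stacked.shape, preds.shape)  # (32, 4) (32, 1)
```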

Upvotes: 3

Views: 1378

Answers (1)

Sascha Kirch

Reputation: 514

It depends on how you parametrize the call to `tf.keras.Model.predict()`.

In the documentation you can find the parameters of `predict`, which include a `use_multiprocessing` parameter:

predict(
    x, batch_size=None, verbose=0, steps=None, callbacks=None, max_queue_size=10,
    workers=1, use_multiprocessing=False
)
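Note that, per the documentation, `workers` and `use_multiprocessing` only take effect when `x` is a generator or a `keras.utils.Sequence`; for a plain NumPy array they are ignored, and `predict` instead slices the array into batches and relies on TensorFlow's internal threading. A minimal sketch (assumes TensorFlow 2.x is installed):

```python
import numpy as np
import tensorflow as tf

# Tiny throwaway model, just to have something to predict with.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

x = np.random.rand(100, 4).astype("float32")

# For array input, batch_size controls how many samples go through the
# graph per step; workers/use_multiprocessing would only matter if x
# were a generator or a keras.utils.Sequence.
preds = model.predict(x, batch_size=32, verbose=0)
print(preds.shape)  # (100, 1)
```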

Besides multiprocessing, predicting on batches instead of single samples benefits from vectorization, which additionally speeds up the prediction.
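To illustrate the vectorization point (pure NumPy, with a single weight matrix standing in for the model): one matrix multiply over a batch produces the same numbers as a Python loop over single samples, but as one vectorized operation instead of many small ones.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.random((4, 1))          # weights of a hypothetical Dense(1) layer
batch = rng.random((1000, 4))   # 1000 samples, 4 features

# Batched: a single (1000, 4) @ (4, 1) matrix multiply.
batched = batch @ W

# Per-sample: 1000 separate (1, 4) @ (4, 1) multiplies.
looped = np.vstack([row[None, :] @ W for row in batch])

print(np.allclose(batched, looped))  # True
```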

Upvotes: 2
