Reputation: 5992
When running model.predict(features, batch_size=n) with batch_size > 1, are those predictions made in parallel, i.e., across multiple cores at once?
I need to permute millions of columns, so I was hoping to infer many predictions simultaneously using multiprocessing. However, there are major challenges with this: Keras + Tensorflow and Multiprocessing in Python
One of my friends suggested stacking the data and using batch_size as a workaround to increase performance, but since it would be a big rewrite, I am wondering what the potential gains would be.
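The stacking idea can be sketched with NumPy, assuming a permutation-importance style workflow where one column is shuffled per copy; the model call is left as a comment and all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))  # 8 samples, 4 feature columns (toy sizes)

# Stack one permuted copy of X per column. Each block has exactly one
# column shuffled, so a single model.predict() call on `stacked`
# replaces n_cols separate predict() calls.
blocks = []
for col in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, col] = rng.permutation(Xp[:, col])
    blocks.append(Xp)
stacked = np.vstack(blocks)          # shape: (n_cols * n_rows, n_cols)

# preds = model.predict(stacked, batch_size=256)   # one vectorized call
# preds.reshape(X.shape[1], X.shape[0], -1)        # per-column results
print(stacked.shape)  # (32, 4)
```

Memory is the main constraint: stacking millions of permuted copies at once will not fit, so the copies would still be generated and predicted in chunks.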
Upvotes: 3
Views: 1378
Reputation: 514
It depends on how you parametrize the call to tf.keras.Model.predict(). In the documentation you can find the parameters of predict, which include a use_multiprocessing parameter:
predict(
x, batch_size=None, verbose=0, steps=None, callbacks=None, max_queue_size=10,
workers=1, use_multiprocessing=False
)
Note that workers and use_multiprocessing only take effect when the input x is a generator or a keras.utils.Sequence; they parallelize the input data loading, not the forward pass itself. Besides multiprocessing, predicting on batches instead of single samples benefits from vectorization, which additionally speeds up inference.
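The vectorization point can be illustrated without TensorFlow. Below is a minimal NumPy sketch of a single dense layer (a stand-in for a model, not Keras internals), comparing a per-sample loop with one batched call:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 2))          # stand-in for a trained dense layer
X = rng.normal(size=(1000, 4))       # 1000 samples, 4 features

# One sample at a time: 1000 separate matrix-vector products,
# each with Python-level call overhead.
single = np.stack([x @ W for x in X])

# One batched call: a single matrix-matrix product, which BLAS
# executes with vectorized (and often multithreaded) kernels.
batched = X @ W

print(np.allclose(single, batched))  # True
```

The results are identical; the batched version simply amortizes per-call overhead and lets the underlying linear-algebra kernels process many rows at once, which is the same reason a larger batch_size speeds up model.predict.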
Upvotes: 2