ZennKa

Reputation: 65

TF.js first model prediction takes too long

I have a face detection TF.js model and I would like to use it in the browser to detect faces for incoming video frames.

Usually, it takes around 10ms to run prediction for every new frame, but it takes around 2000ms for the first frame. Is it possible to do something with this, since the user sees the frozen video while the face prediction for the first frame is being executed?

What do I have:

Performance of the first prediction:

// tf.time() execution output
kernelMs: 187.4139999999999
uploadWaitMs: 82.7599998738151
wallMs: 2528.85500001139

What I tried:

I tried to warm up the model in a worker, make some predictions there, and then continue using the model in the main thread. But I see another warm-up delay as soon as I start calling the model from the main thread.

Upvotes: 3

Views: 405

Answers (1)

edkeveked

Reputation: 18381

You need to warm up the model before using it for the first prediction.

Warming up the model means calling it once with a dummy tensor, such as a tf.zeros tensor whose shape matches your model's input. The first call is slow because the backend compiles shaders and uploads the weights to the GPU; subsequent calls reuse that work.

After the warm-up, you can use your model for prediction at normal speed.
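A minimal warm-up sketch. The `warmUpModel` helper and the `[1, 224, 224, 3]` input shape are assumptions for illustration; replace the shape with your model's actual input signature.

```javascript
// Warm up a TF.js model by running one throwaway prediction.
// `tf` is the TensorFlow.js namespace, `model` a loaded model,
// `inputShape` your model's input shape (assumption: [1, 224, 224, 3]).
async function warmUpModel(tf, model, inputShape) {
  const dummy = tf.zeros(inputShape);   // all-zeros "dummy" input tensor
  const result = model.predict(dummy);  // triggers shader compilation / weight upload
  await result.data();                  // wait for the GPU work to actually finish
  dummy.dispose();                      // free the tensors we created
  result.dispose();
}

// Usage, e.g. right after loading the model and before the video starts:
// await warmUpModel(tf, model, [1, 224, 224, 3]);
```

Note the `await result.data()`: `predict` alone may return before the GPU has executed anything, so downloading the result is what forces the warm-up to complete.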

Upvotes: 2
