mon

Reputation: 22366

tensorflow/keras - does each layer run concurrently all the time?

Question

Does each layer in a TensorFlow/Keras sequential neural network (NN) work all the time? In a CPU, there are multiple pipeline stages, and each stage keeps working rather than waiting for the previous stage to finish.

[Image: Depth of a pipeline in a CPU's architecture]

Suppose there is a network:

[matmul(0) -> batch-norm(1) -> activation(2) -> matmul(3) -> loss(4)].
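
For concreteness, a minimal Keras sketch of such a network (layer sizes, activation, and loss choice are arbitrary placeholders) could look like:

```python
import tensorflow as tf

# Hypothetical stand-in for the pipeline above:
# matmul(0) -> batch-norm(1) -> activation(2) -> matmul(3) -> loss(4)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, use_bias=False, input_shape=(128,)),  # matmul(0)
    tf.keras.layers.BatchNormalization(),                           # batch-norm(1)
    tf.keras.layers.Activation("relu"),                             # activation(2)
    tf.keras.layers.Dense(10, use_bias=False),                      # matmul(3)
])
model.compile(
    optimizer="sgd",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),  # loss(4)
)
```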

While batch i is being processed in the batch-norm(1) layer, could the next batch i+1 be processed in matmul(0), like a stage in a CPU pipeline? I wonder whether such concurrent execution happens, or whether the entire GPU/CPU is dedicated to a single layer at a time.

I saw that TensorFlow uses graphs and tf.function for execution, and I suppose parallel/concurrent execution would be scheduled based on the graph. How is layer execution planned from the graph's perspective?
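
For example, I can trace a toy function with tf.function and list the ops recorded in its graph (a sketch, not my real model):

```python
import tensorflow as tf

@tf.function
def forward(x):
    # Tracing records a dataflow graph of ops; the runtime then schedules
    # ops whose inputs are ready, rather than stepping "layer by layer".
    h = tf.matmul(x, tf.ones([128, 64]))
    h = tf.nn.relu(h)
    return tf.reduce_sum(h)

# Inspect the ops recorded in the traced graph.
concrete = forward.get_concrete_function(tf.TensorSpec([None, 128], tf.float32))
print([op.name for op in concrete.graph.get_operations()])
```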

Upvotes: 2

Views: 312

Answers (1)

Aditya Kane

Reputation: 411

@mon
Excellent question! AFAIK, TensorFlow doesn't execute layers separately. Once the model is compiled, the graph is executed all at once on the CPU, without any parallel processing. In GPU environments, the parts of the graph that can be processed in parallel are in fact processed in parallel. The image you shared is a low-level illustration of how a CPU works, and I don't think it can be applied directly here. Please correct me if I am wrong.
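
If it helps, one way to observe where each op of the graph actually runs (and whether a GPU picks it up) is TensorFlow's device placement logging; a minimal sketch:

```python
import tensorflow as tf

# Log which device (CPU/GPU) each op is placed on when it executes.
tf.debugging.set_log_device_placement(True)

@tf.function
def f(x):
    h = tf.matmul(x, tf.ones([128, 64]))
    return tf.nn.relu(h)

f(tf.random.normal([32, 128]))  # placement of each op is printed to the log
```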

Upvotes: 1
