Reputation: 1091
On an Nvidia GPU we can have multiple kernels running concurrently by using streams. How about the Xeon Phi? If I offload two parts of the computation from different host threads, will they run concurrently on the Xeon Phi?
Upvotes: 2
Views: 445
Reputation: 59110
Yes, you can have concurrent offload executions on the Xeon Phi, up to 64 by default. See the --max-connections parameter of the Coprocessor Offload Infrastructure (COI) daemon running on the Xeon Phi (/bin/coi_daemon):
--max-connections=<int>   The maximum number of connections we allow from host
                          processes. If this is exceeded, new connections
                          are temporarily blocked. Defaults to 64.
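For illustration (not part of the original answer), here is a minimal sketch of what such concurrent offloads can look like with the Intel compiler's offload pragmas: two host OpenMP threads each issue their own offload region, and each host thread's offload uses its own COI connection, so the two regions can run concurrently on the coprocessor (subject to the daemon's --max-connections limit). The function name and array size are made up for the example.

#include <stdio.h>
#include <stdlib.h>

#define N 1000000

/* Compiled for both host and coprocessor. */
__attribute__((target(mic)))
void scale_array(double *a, int n, double factor)
{
    for (int i = 0; i < n; i++)
        a[i] *= factor;
}

int main(void)
{
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    for (int i = 0; i < N; i++) { a[i] = 1.0; b[i] = 1.0; }

    #pragma omp parallel sections num_threads(2)
    {
        #pragma omp section
        {
            /* Offload issued by the first host thread. */
            #pragma offload target(mic:0) inout(a : length(N))
            scale_array(a, N, 2.0);
        }
        #pragma omp section
        {
            /* Offload issued by the second host thread; it can run on the
               coprocessor at the same time as the one above. */
            #pragma offload target(mic:0) inout(b : length(N))
            scale_array(b, N, 3.0);
        }
    }

    printf("a[0] = %g, b[0] = %g\n", a[0], b[0]);
    free(a);
    free(b);
    return 0;
}

Build with the Intel compiler with OpenMP enabled. Note that whether the two regions actually overlap in time also depends on how many coprocessor threads each offloaded computation consumes.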
Upvotes: 3