Reputation: 1104
When we instantiate a gRPC server, we pass a ThreadPoolExecutor to its constructor:
grpc.server(futures.ThreadPoolExecutor(max_workers=1))
As far as I know, Python has the GIL, which makes threads useless for CPU-bound tasks.
For example, my gRPC server serves a TensorFlow model and the gRPC health checking service. Is there any benefit to increasing the number of threads in the thread pool?
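For reference, here is a minimal sketch of that setup, assuming the standard grpcio and grpcio-health-checking packages; the port, worker count, and the spot where the model servicer would go are placeholders:

from concurrent import futures

import grpc
from grpc_health.v1 import health, health_pb2, health_pb2_grpc

def serve(max_workers=1):
    # The thread pool bounds how many RPCs the server handles concurrently.
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=max_workers))

    # Register the standard health checking service and mark the server SERVING.
    health_servicer = health.HealthServicer()
    health_servicer.set("", health_pb2.HealthCheckResponse.SERVING)
    health_pb2_grpc.add_HealthServicer_to_server(health_servicer, server)

    # The TensorFlow model servicer would be registered here as well.

    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()

if __name__ == "__main__":
    serve()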
Upvotes: 3
Views: 3676
Reputation: 960
Right now gRPC Python is tied to a concurrent.futures thread pool; I believe the roadmap shows that we'll be able to use asyncio in the future. The thread pool executor just lets us specify the maximum number of RPCs handled concurrently. You are correct that Python has the GIL: only one thread can execute Python bytecode at a time, so threads give us concurrency but not parallelism. CPU-bound tasks therefore don't benefit from adding more threads in Python.
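A quick, illustrative way to see this (a standalone timing sketch, nothing gRPC-specific): running a pure-Python CPU-bound function four times sequentially takes roughly as long as running it across four threads, because only one thread executes Python bytecode at a time:

import time
from concurrent.futures import ThreadPoolExecutor

def cpu_bound(n=5_000_000):
    # Pure-Python loop; it never releases the GIL.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    start = time.perf_counter()
    for _ in range(4):
        cpu_bound()
    print(f"sequential x4: {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(lambda _: cpu_bound(), range(4)))
    print(f"4 threads:     {time.perf_counter() - start:.2f}s")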
Whether more threads in the thread pool would have any benefit really depends on the details of your application. You could run your gRPC server in one process and TensorFlow in another (using the multiprocessing module); that gives you access to more cores, but you'll need to decide on a protocol for passing requests and results between the TensorFlow process and the gRPC server.
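For example, one rough sketch of such a protocol (the queue-based message format and the stubbed-out model call are purely illustrative) is to push requests onto a multiprocessing queue and read results back from another:

import multiprocessing as mp

def model_worker(requests, results):
    # In the real service this process would load the TensorFlow model once
    # and run inference; here the prediction is a placeholder.
    while True:
        item = requests.get()
        if item is None:  # shutdown sentinel
            break
        request_id, payload = item
        results.put((request_id, f"prediction for {payload}"))

if __name__ == "__main__":
    requests = mp.Queue()
    results = mp.Queue()

    worker = mp.Process(target=model_worker, args=(requests, results), daemon=True)
    worker.start()

    # Inside a gRPC handler you would enqueue the request and wait for the
    # matching result (keyed by request_id in this sketch).
    requests.put((1, "some input tensor"))
    print(results.get())

    requests.put(None)  # stop the worker
    worker.join()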
Upvotes: 3