Reputation: 3737
We use streaming RPCs to send big files to the GRPC server. Like this:
service FileReceiver
{
rpc addData(stream DataChunk) returns (Empty);
}
Is it possible to use a proxy load balancer in this case, so that the load balancer won't switch servers in the middle of a streaming request? Will it scale well as the number of clients increases?
Upvotes: 4
Views: 4451
Reputation: 26474
HTTP load balancers typically balance per HTTP request, and a gRPC stream is a single HTTP request, regardless of how many messages are in the stream. So a proxy will not switch backends in the middle of a stream, yet each client (and each new stream) can still be directed to a different backend, which allows it to scale. gRPC behaves how you want out of the box.
Streaming RPCs are stateful and so all messages must go to the same backend. This can be essential for result consistency (like with reflection) and helpful for performance in certain workloads (like your case).
One note about scalability though: if the streams are long-lived, you can have "hot spots" where certain backends have a high proportion of the streams. Your service can periodically (minutes or hours depending on your needs) close the stream and have the client re-create the stream to rebalance.
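A minimal sketch of that rebalancing idea, with hypothetical names: `open_stream`, `close_stream`, and `send_chunk` are pure-Python stand-ins for opening the `addData` client stream, half-closing it, and sending one `DataChunk`. In real code the rebalance interval would be time-based (minutes or hours) rather than a chunk count.

```python
# Hypothetical sketch: split one long upload into several shorter
# streaming RPCs so the proxy gets a chance to pick a new backend
# between streams. Names below are illustrative, not a real gRPC API.

CHUNKS_PER_STREAM = 3  # rebalance point; in practice, use elapsed time


def chunk_file(data: bytes, size: int):
    """Yield fixed-size slices of `data` (the DataChunk payloads)."""
    for i in range(0, len(data), size):
        yield data[i:i + size]


def upload(data: bytes, chunk_size: int, send_chunk, open_stream, close_stream):
    """Send every chunk, closing and reopening the stream periodically."""
    stream = open_stream()
    sent_on_stream = 0
    for chunk in chunk_file(data, chunk_size):
        if sent_on_stream == CHUNKS_PER_STREAM:
            close_stream(stream)      # finish the current addData RPC
            stream = open_stream()    # the proxy may route this to a new backend
            sent_on_stream = 0
        send_chunk(stream, chunk)
        sent_on_stream += 1
    close_stream(stream)              # finish the final RPC
```

The trade-off is extra per-stream overhead (a new RPC every interval) in exchange for letting long-lived streams drain off hot backends; servers can achieve the same effect by setting a maximum connection age so clients are forced to reconnect.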
Upvotes: 16