Reputation: 1107
Suppose the message that I need to send from server to client is like this:
message BatchReply {
    bytes data = 1;
    repeated int32 shape = 2;
    string dtype = 3;
    repeated int64 labels = 4;
}
Here shape/dtype are small fields that can be represented in a few ints' worth of space, while data/labels are large memory buffers that can take as much as 1 GB of memory.
I am trying to send this message with stream:
service ImageService {
    rpc get_batch (BatchRequest) returns (stream BatchReply) {}
}
My question is that the examples I could find for sending a message through a stream all use messages with only one field in the message struct, such as:
service TransferFile {
    rpc Upload(stream Chunk) returns (Reply) {}
}
message Chunk {
    bytes buffer = 1; // there is only one field, buffer; what if there were a field int32 val = 2; ?
}
What if there are two fields in the Chunk struct? Do I need to call set_val() every time I call set_buffer() while feeding the same stream?
Upvotes: 1
Views: 1775
Reputation: 1580
You can simply send a message with multiple fields over gRPC; that is the advantage of using protobuf.
I do not know whether the transport layer you are using can handle a message as large as the one you describe. You could test that. If it cannot, you can fall back to the TransferFile example you gave.
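To answer the set_val()/set_buffer() part directly: every streamed message is a complete, independent protobuf, so you set whichever fields you need on each message, and unset proto3 scalars simply read back as zero/empty on the other side. A minimal pure-Python sketch of that idea, using a dataclass as a stand-in for the generated Chunk class (the Chunk, buffer, and val names come from the question; no real gRPC calls are made):

```python
from dataclasses import dataclass
from typing import Iterator

# Stand-in for the generated Chunk message. In proto3, a scalar field
# you never set is not an error; it just takes its default value.
@dataclass
class Chunk:
    buffer: bytes = b""
    val: int = 0  # e.g. a sequence number; defaults to 0 when unset

def upload_stream(data: bytes, chunk_size: int = 100) -> Iterator[Chunk]:
    """Yield one Chunk per segment; both fields are set on every message."""
    for i in range(0, len(data), chunk_size):
        yield Chunk(buffer=data[i:i + chunk_size], val=i // chunk_size)

chunks = list(upload_stream(b"x" * 250))
print(len(chunks))                    # 3 chunks for 250 bytes
print(chunks[0].val, chunks[2].val)   # 0 2
```

The same pattern holds with the real generated classes: each set_buffer()/set_val() pair applies only to the message currently being sent.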
Looking at Chunk, I get the impression that they are sending segments of the whole payload and then, on the other side, reconstructing the segments into the complete data set. The type used in Chunk is bytes: just raw bytes, which can represent anything you like.
To send your BatchReply in chunks you can use the following steps:
1. Serialize the BatchReply object into a byte array.
2. Split the byte array into Chunk objects, for example 100 bytes each.
3. Send each Chunk object using the TransferFile interface.
4. On the receiving side, concatenate the chunks into one array and deserialize the array back into a BatchReply object.
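The steps above can be sketched end to end in Python. This uses JSON as a stand-in serializer so the sketch is self-contained; real code would call the generated BatchReply.SerializeToString() and BatchReply.FromString() methods, and data would be real bytes rather than a string:

```python
import json

def serialize(reply: dict) -> bytes:
    # Step 1: turn the whole BatchReply into one byte array.
    return json.dumps(reply).encode("utf-8")

def deserialize(raw: bytes) -> dict:
    # Step 4 (second half): rebuild the BatchReply from the byte array.
    return json.loads(raw.decode("utf-8"))

def split_into_chunks(raw: bytes, size: int = 100) -> list:
    # Step 2: divide the serialized bytes into fixed-size segments,
    # each of which would become the buffer field of one Chunk.
    return [raw[i:i + size] for i in range(0, len(raw), size)]

# Sender side: serialize, split, then stream each segment (step 3).
reply = {"data": "AAAA" * 200, "shape": [2, 400], "dtype": "uint8", "labels": [1, 2]}
wire = split_into_chunks(serialize(reply))

# Receiver side: concatenate the segments and deserialize.
received = deserialize(b"".join(wire))
assert received == reply
```

The chunk size (100 bytes here, as in the answer) is tunable; in practice gRPC chunks are usually much larger (tens of KB) to reduce per-message overhead.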
Upvotes: 0