Reputation: 604
Does anybody know the maximum lengths? The Google docs do not mention them. https://cloud.google.com/appengine/docs/python/tools/protorpc/messages/fieldclasses
Upvotes: 0
Views: 116
Reputation: 991
Protobufs encode string and bytes fields as a length-prefixed sequence of bytes: an integer length followed by the raw data, with no cap on the data itself. The length is encoded as a varint, which has no fixed size like a C int or a Java long and can therefore represent arbitrarily large values. In theory, then, you can stuff as much data into a string or bytes field as you have memory to support.
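To make the length-prefix idea concrete, here is a minimal sketch (not the official protobuf library; the function names are mine) of how a length is written as a varint: seven bits per byte, high bit set while more bytes follow, so the length itself has no built-in upper bound.

```python
def encode_varint(value):
    """Encode a non-negative integer as a protobuf-style varint (sketch)."""
    out = bytearray()
    while True:
        byte = value & 0x7F          # low seven bits of the value
        value >>= 7
        if value:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)         # high bit clear: final byte
            return bytes(out)

def encode_length_delimited(data):
    """Prefix raw bytes with their varint-encoded length, as protobuf does
    for string and bytes fields (field tag omitted for brevity)."""
    return encode_varint(len(data)) + data

# A 300-byte payload needs only a two-byte length prefix: 300 -> 0xAC 0x02.
print(encode_length_delimited(b"x" * 300)[:2].hex())  # 'ac02'
```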
However, protobufs are usually processed by native implementations in Java, Go, Python, etc., and those languages do impose maximum lengths. In Java, for instance, where a String is backed by an array and arrays are indexed by int, the maximum length of a string (and of an array) is about 2.1 billion characters (2^31 - 1, to be precise). Any longer string in a protobuf will therefore not be interoperable and may require special code to read and write.
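Tying this back to the protorpc docs you linked: StringField and BytesField take a field number but no maximum-length argument, so the only limits are the runtime ones above. A minimal message definition (the class and field names here are just examples):

```python
from protorpc import messages

class Attachment(messages.Message):
    """Example message; the field classes accept no max-length parameter."""
    name = messages.StringField(1)   # bounded by runtime/memory, not the API
    data = messages.BytesField(2)    # same: no declared size limit
```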
That's the theory. In practice, if you have data that's over a gigabyte in size, don't try to pass it around with protorpc (or any RPC format) at all. You'll need a service and API designed for large data, such as Blobstore, and your protorpc code can then work with the URLs to your blobs.
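One possible shape for that, sketched with hypothetical names (the blob key and serving URL would come from Blobstore or your own upload handler, not from protorpc itself):

```python
from protorpc import messages

class LargeFileRef(messages.Message):
    """Pass a reference to the large object instead of the bytes themselves."""
    blob_key = messages.StringField(1)     # e.g. str() of a Blobstore BlobKey
    serving_url = messages.StringField(2)  # URL the client can fetch directly
```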
Upvotes: 1