Reputation: 657
I am receiving protobuf messages on Kafka. The consumer is configured to deserialize the events using
value.deserializer = org.apache.kafka.common.serialization.StringDeserializer
If I call the parseFrom(byte[] data)
method of com.google.protobuf.Parser
with the byte array obtained from the deserialized event string, the method throws the following exception:
com.google.protobuf.InvalidProtocolBufferException: While parsing a protocol message, the input ended unexpectedly in the middle of a field. This could mean either than the input has been truncated or that an embedded message misreported its own length.
If I instead deserialize the Kafka events with
value.deserializer = org.apache.kafka.common.serialization.ByteArrayDeserializer
and pass the resulting byte array directly to parseFrom, the protobuf message is parsed correctly without any exception.
Why does the second way work, but the first does not?
Upvotes: 4
Views: 3769
Reputation: 11880
You are using a String deserializer, which decodes the raw bytes as a UTF-8 string. A protobuf payload is arbitrary binary data, not text, so byte sequences that are not valid UTF-8 get replaced during decoding. Converting that String back to a byte array therefore gives you a corrupted copy of the original message, and parseFrom fails.
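A minimal sketch of the corruption, using only the JDK (the three bytes are the standard protobuf wire encoding of field 1 = varint 150, as in the protobuf documentation; no Kafka involved):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class Utf8RoundTrip {
    public static void main(String[] args) {
        // Protobuf wire bytes for "field 1, varint 150": 0x08 0x96 0x01.
        // 0x96 is a lone UTF-8 continuation byte, i.e. not valid UTF-8.
        byte[] original = {0x08, (byte) 0x96, 0x01};

        // This is effectively what StringDeserializer does:
        String asString = new String(original, StandardCharsets.UTF_8);

        // Re-encoding the String does NOT give back the original bytes:
        // the invalid 0x96 was replaced with U+FFFD (encoded as EF BF BD).
        byte[] roundTripped = asString.getBytes(StandardCharsets.UTF_8);

        System.out.println(Arrays.equals(original, roundTripped)); // false
        System.out.println(Arrays.toString(roundTripped));
    }
}
```

Passing `roundTripped` to parseFrom fails because the payload no longer matches what the producer wrote, which is exactly the exception you are seeing.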
You have a producer which serializes with ByteArraySerializer, so your consumer must deserialize it with a ByteArrayDeserializer.
Try producing with org.apache.kafka.common.serialization.StringSerializer
only if you really want to deserialize the payload as a String, which you don't for protobuf.
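As a sketch, this is the matched pair for a binary protobuf payload (property names as in the standard Kafka client configs):

```properties
# producer side
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer

# consumer side
value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
```

The rule of thumb is that the deserializer must mirror the serializer; the raw bytes then reach parseFrom untouched.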
Upvotes: 2