Reputation: 102
I want to implement an Avro serializer/deserializer for a Kafka producer/consumer. There can be multiple scenarios.
My Question: How do I send the schema along while producing, so that the deserializer can read the whole byte array and separate the actual payload from the schema? I am using an Avro-generated class. Note: I don't want to use a schema registry.
Upvotes: 0
Views: 1026
Reputation: 191884
You need a reader and a writer schema in any Avro use case, even if they are the same. SpecificDatumWriter (for the serializer) and SpecificDatumReader (for the deserializer) both take a schema.
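A minimal round-trip sketch, assuming a hypothetical Avro-generated class User (the class name and its fields are illustrative; getClassSchema() is the accessor Avro generates on specific-record classes, and the org.apache.avro dependency is required):

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificDatumWriter;

public class AvroRoundTrip {

    // Serializer side: the writer schema comes from the generated class.
    public static byte[] serialize(User user) throws Exception {
        SpecificDatumWriter<User> writer =
                new SpecificDatumWriter<>(User.getClassSchema());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        writer.write(user, encoder);
        encoder.flush();
        return out.toByteArray();
    }

    // Deserializer side: here the reader schema equals the writer schema;
    // they may differ if the schema has evolved between producer and consumer.
    public static User deserialize(byte[] bytes) throws Exception {
        SpecificDatumReader<User> reader =
                new SpecificDatumReader<>(User.getClassSchema());
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        return reader.read(null, decoder);
    }
}
```

These byte arrays are what you would hand to a plain ByteArraySerializer/ByteArrayDeserializer on the Kafka side.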
You could use Kafka record headers to encode the AVSC string and send it along with the payload, but keep in mind that Kafka records/batches have an upper bound on allowed size. Using a schema registry (it doesn't have to be Confluent's) reduces that overhead from a whole schema string to a small integer ID.
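A sketch of the header approach, assuming the serialize helper and the generated User class from above, plus the kafka-clients dependency; the topic name and header key are made up for illustration:

```java
import java.nio.charset.StandardCharsets;
import org.apache.avro.Schema;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SchemaInHeaders {

    // Producer side: put the writer schema's AVSC string in a record header,
    // and the Avro binary body in the record value.
    public static ProducerRecord<String, byte[]> toRecord(User user) throws Exception {
        byte[] payload = AvroRoundTrip.serialize(user);
        ProducerRecord<String, byte[]> record =
                new ProducerRecord<>("users-topic", payload);
        record.headers().add("avro.schema",
                user.getSchema().toString().getBytes(StandardCharsets.UTF_8));
        return record;
    }

    // Consumer side: recover the writer schema from the header, then decode
    // the value with it (e.g. via a GenericDatumReader built from this schema).
    public static Schema writerSchemaOf(ConsumerRecord<String, byte[]> rec) {
        String avsc = new String(
                rec.headers().lastHeader("avro.schema").value(),
                StandardCharsets.UTF_8);
        return new Schema.Parser().parse(avsc);
    }
}
```

Note that the full AVSC string rides on every record here, which is exactly the per-record overhead a registry-issued integer ID would avoid.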
Upvotes: 1