Pintu

Reputation: 102

Sending avro schema along with payload

I want to implement avro serializer/deserializer for Kafka producer/consumer. There can be multiple scenarios

  1. Writer schema and reader schema are the same and will never change. In that scenario there is no need to send the Avro schema along with the payload; the consumer can deserialize the payload using the reader schema alone. A sample implementation is provided in this post.
  2. The schema will evolve over time, so we use Avro's schema resolution feature, which can still deserialize when the reader and writer schemas differ by applying the schema resolution rules. In that case we need to send the Avro schema along with the payload.

My Question: How do I send the schema along with the payload while producing, so that the deserializer can read the whole byte array and separate the actual payload from the schema? I am using Avro-generated classes. Note: I don't want to use a schema registry.

Upvotes: 0

Views: 1026

Answers (1)

OneCricketeer

Reputation: 191884

You need a reader and a writer schema in any Avro use case, even if they are the same. SpecificDatumWriter (for the serializer) and SpecificDatumReader (for the deserializer) both take a schema.
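A minimal sketch of how the two schemas interact, assuming Avro is on the classpath. It uses GenericDatumWriter/GenericDatumReader with a hypothetical `User` record (the Specific* variants used with generated classes take schemas the same way); the evolved reader schema is reconciled with the writer schema via schema resolution:

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class SchemaResolutionSketch {
    public static void main(String[] args) throws Exception {
        // Writer schema: the shape the producer serialized with.
        Schema writerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"name\",\"type\":\"string\"}]}");
        // Reader schema: evolved with an extra field that has a default.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"name\",\"type\":\"string\"},"
          + "{\"name\":\"age\",\"type\":\"int\",\"default\":-1}]}");

        GenericRecord user = new GenericData.Record(writerSchema);
        user.put("name", "alice");

        // Producer side: serialize with the writer schema.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(writerSchema).write(user, encoder);
        encoder.flush();

        // Consumer side: schema resolution needs BOTH schemas to decode.
        GenericDatumReader<GenericRecord> reader =
            new GenericDatumReader<>(writerSchema, readerSchema);
        Decoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord decoded = reader.read(null, decoder);
        System.out.println(decoded); // "age" is filled in from the default
    }
}
```

This is why the writer schema has to travel with (or be resolvable for) the payload: without it, the two-schema constructor above cannot be built on the consumer side.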

You could use Kafka record headers to carry the AVSC string alongside the payload, but keep in mind that Kafka records/batches have an upper bound on allowed size. Using a schema registry (it doesn't have to be Confluent's) reduces that overhead from a whole schema string to a small integer ID.
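If you would rather keep everything in the record value than use headers, one option is to length-prefix the schema JSON ahead of the Avro payload, so the deserializer can split the two apart again. A minimal JDK-only sketch (the schema string and payload bytes here are placeholders; in practice the schema would come from your generated class, e.g. `User.getClassSchema().toString()`, and the payload from an Avro BinaryEncoder):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class SchemaFraming {
    // Wire format: [4-byte schema length][schema JSON bytes][payload bytes]
    static byte[] frame(String schemaJson, byte[] payload) {
        byte[] schemaBytes = schemaJson.getBytes(StandardCharsets.UTF_8);
        return ByteBuffer.allocate(4 + schemaBytes.length + payload.length)
                .putInt(schemaBytes.length)
                .put(schemaBytes)
                .put(payload)
                .array();
    }

    // Consumer side: returns {schemaJson, payload} split back out.
    static Object[] unframe(byte[] message) {
        ByteBuffer buf = ByteBuffer.wrap(message);
        byte[] schemaBytes = new byte[buf.getInt()];
        buf.get(schemaBytes);
        byte[] payload = new byte[buf.remaining()];
        buf.get(payload);
        return new Object[] {
            new String(schemaBytes, StandardCharsets.UTF_8), payload
        };
    }

    public static void main(String[] args) {
        byte[] framed = frame("{\"type\":\"string\"}", new byte[] {1, 2, 3});
        Object[] parts = unframe(framed);
        System.out.println(parts[0]);                  // the schema JSON
        System.out.println(((byte[]) parts[1]).length); // payload length
    }
}
```

The size caveat above still applies: the full schema JSON is repeated in every record, which is exactly the overhead a registry's integer ID avoids.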

Upvotes: 1
