Reputation: 13
Background: I used Spring Kafka to implement an Avro-based consumer and producer. Other important components: the Kafka broker, ZooKeeper, and the Schema Registry run in Docker containers, and this works perfectly fine for me.
What I want: I want a Kafka Avro deserializer (in the consumer) that is independent of the Schema Registry. In my case, I have an Avro schema file that will not change, so I want to get rid of the additional step of using the Schema Registry on the consumer side and instead use a local schema file.
Upvotes: 1
Views: 1923
Reputation: 191671
If the producer's serializer uses the Schema Registry, then the consumer should as well. Avro requires you to have both a reader and a writer schema.
If the consumer, for whatever reason, cannot access the Registry over the network, you would need to use ByteArrayDeserializer, then take the byte slice after position 5 (one magic byte 0x0 plus a 4-byte schema ID integer) of the byte[] from the part of the record you want to parse.
Then, from the Avro API, you can use a GenericDatumReader along with your local Schema reference to get a GenericRecord instance. This assumes your reader and writer schemas are exactly the same, which is not guaranteed, since the producer could change the schema at any time.
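A minimal sketch of the byte-slicing step described above, assuming the Confluent wire format (magic byte 0x0, then a 4-byte big-endian schema ID, then the raw Avro payload). The class and method names here are hypothetical, not from any library:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Hypothetical helper: strips the Confluent wire-format header from a record
// value consumed with ByteArrayDeserializer, leaving the raw Avro payload
// to hand to a GenericDatumReader with your local schema.
public class ConfluentHeaderStripper {

    // Confluent framing: byte 0 is the magic byte 0x0, bytes 1-4 hold the
    // schema ID as a big-endian int, and the Avro data starts at byte 5.
    static final int HEADER_SIZE = 5;

    static int schemaId(byte[] record) {
        ByteBuffer buffer = ByteBuffer.wrap(record);
        byte magic = buffer.get();
        if (magic != 0x0) {
            throw new IllegalArgumentException("Unknown magic byte: " + magic);
        }
        return buffer.getInt(); // the 4-byte schema ID
    }

    static byte[] avroPayload(byte[] record) {
        // Everything after the 5-byte header is the Avro-encoded datum.
        return Arrays.copyOfRange(record, HEADER_SIZE, record.length);
    }
}
```

With the header removed, the payload could then be decoded against the local schema, e.g. via Avro's GenericDatumReader and DecoderFactory.get().binaryDecoder(payload, null); those calls require the org.apache.avro dependency and are not shown here.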
Or you can generate a SpecificRecord class from the schema you have and configure the KafkaAvroDeserializer to use that.
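As a sketch of that second option, the consumer configuration would look roughly like the following. The broker address, group id, and Registry URL are placeholder assumptions; the property names are the standard Confluent consumer settings, and note the Registry URL is still required because the deserializer fetches the writer schema by ID:

```java
import java.util.Properties;

// Sketch of consumer properties for KafkaAvroDeserializer returning a
// generated SpecificRecord class instead of a GenericRecord.
public class SpecificRecordConsumerConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", "avro-consumer");             // assumed group id
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // still needed to fetch the writer schema
        // Tell the deserializer to deserialize into your generated
        // SpecificRecord class rather than a GenericRecord:
        props.put("specific.avro.reader", "true");
        return props;
    }
}
```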
Upvotes: 2