Reputation: 3148
I am wondering whether it is allowed to produce a message with a JSON key schema and an Avro value schema.
Is there a restriction on mixing and matching schema types within a producer record?
Does the Schema Registry ban it? Is it considered bad practice?
Upvotes: 0
Views: 1313
Reputation: 191671
There is no such limitation on mixing formats. Kafka stores bytes; it doesn't care about the serialization. However, a random consumer of the topic might not know how to deserialize the data, so using Avro (or Protobuf) for both the key and the value would be a good idea.
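For illustration, here is a minimal sketch of a producer that mixes the two formats, assuming Confluent's kafka-json-schema-serializer and kafka-avro-serializer dependencies and a local broker and Schema Registry; the topic name, key class, and value schema are made up:

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class MixedSchemaProducer {

    // Plain POJO key; KafkaJsonSchemaSerializer derives a JSON schema from it.
    public static class UserKey {
        public String userId;
        public UserKey() { }
        public UserKey(String userId) { this.userId = userId; }
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // JSON Schema serializer for the key, Avro serializer for the value.
        // The registry stores them under separate subjects
        // (events-key and events-value), so mixing is allowed.
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema valueSchema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":"
                + "[{\"name\":\"action\",\"type\":\"string\"}]}");
        GenericRecord value = new GenericData.Record(valueSchema);
        value.put("action", "login");

        try (KafkaProducer<UserKey, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", new UserKey("user-42"), value));
        }
    }
}
```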
Upvotes: 1
Reputation: 139
You can use UTF-8 text for the key and Avro for the value, or have both as Avro. When I tried to use Kafka REST with a key that was not in Avro format while the message was, I couldn't use the consumer; it looks like that is still the case, based on this issue. But if you implement your own producer and consumer, you decide what to encode/decode which way.
You have to be careful with keys, because the key determines which partition a message is sent to. In most scenarios messages should stay in time order, and that can only be achieved if they go to the same partition. If you have a userId or any other identifier, you likely want all events for that user to go to the same partition, so use the userId as the key. I wouldn't use JSON as a key unless your key is based on a few fields, and even then you have to be careful not to end up with messages for the same entity on different partitions (see the sketch below).
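To see why JSON keys are risky: the default partitioner hashes the serialized key bytes with murmur2, so two JSON encodings of the same logical key that differ only in field order can land on different partitions. A small self-contained sketch using the public hash helpers from kafka-clients; the partition count and JSON strings are invented for illustration:

```java
import java.nio.charset.StandardCharsets;
import org.apache.kafka.common.utils.Utils;

public class JsonKeyPartitioningDemo {

    // Same computation the default partitioner applies to non-null key bytes.
    static int partitionFor(String key, int numPartitions) {
        byte[] keyBytes = key.getBytes(StandardCharsets.UTF_8);
        return Utils.toPositive(Utils.murmur2(keyBytes)) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 6;
        // Two JSON encodings of the same logical key; only field order differs.
        String a = "{\"userId\":\"42\",\"region\":\"eu\"}";
        String b = "{\"region\":\"eu\",\"userId\":\"42\"}";
        System.out.println(partitionFor(a, partitions));
        System.out.println(partitionFor(b, partitions)); // may differ, breaking per-user ordering
    }
}
```

If you do want a structured key, canonicalize it before serializing (fixed field order, no extra whitespace) so the same logical key always produces the same bytes.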
Upvotes: 0