Reputation: 9771
Does Kafka Streams / KSQL actually support JSON natively somehow? What other formats are supported? I have seen that it is possible to have flat JSON interpreted as a table, and I want to understand that part a bit better: what other formats can Kafka Streams, via KSQL, expose for querying with SQL? How is that possible or supported? What is the native support?
Upvotes: 2
Views: 434
Reputation: 3333
KSQL
For value formats, KSQL supports AVRO, JSON, and DELIMITED (e.g. CSV).
You can find the documentation here:
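To make the "flat JSON as a table" part concrete, here is a minimal sketch of how a topic with flat JSON values can be declared and queried in KSQL. The stream name, topic name, and columns are made up for illustration, and newer ksqlDB versions additionally require EMIT CHANGES on the SELECT:

```sql
-- Declare a stream over an existing topic whose values are flat JSON records.
-- Each top-level JSON field is mapped to a column.
CREATE STREAM pageviews (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
  WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');

-- Query it with SQL (a continuous/push query in KSQL).
SELECT userid, pageid FROM pageviews;
```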
Kafka Streams
Kafka Streams comes with some primitive/basic SerDes (Serializers/Deserializers) under the org.apache.kafka.common.serialization package.
You can find the documentation here:
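As a rough sketch of how those built-in SerDes are wired into a topology (the topic names here are hypothetical):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class BuiltInSerdesExample {
    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();
        // Read a topic whose keys are strings and whose values are longs,
        // using the primitive SerDes that ship with Kafka Streams.
        builder.stream("counts-topic",                        // hypothetical topic name
                       Consumed.with(Serdes.String(), Serdes.Long()))
               .filter((key, value) -> value != null && value > 0)
               .to("positive-counts",                         // hypothetical output topic
                   Produced.with(Serdes.String(), Serdes.Long()));
        return builder.build();
    }
}
```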
Confluent also provides schema-registry compatible Avro SerDes for data in generic Avro and in specific Avro format. You can find the documentation here:
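A minimal sketch of the generic Avro variant, assuming a Schema Registry running at http://localhost:8081 and a hypothetical topic name:

```java
import java.util.Collections;
import java.util.Map;

import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;

public class GenericAvroExample {
    public static void main(String[] args) {
        // The Confluent Avro SerDe needs to know where Schema Registry lives.
        Map<String, String> serdeConfig =
                Collections.singletonMap("schema.registry.url", "http://localhost:8081");

        Serde<GenericRecord> valueSerde = new GenericAvroSerde();
        valueSerde.configure(serdeConfig, /* isKey */ false);

        // Consume a topic whose values are Avro records registered in Schema Registry.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, GenericRecord> records =
                builder.stream("avro-topic", Consumed.with(Serdes.String(), valueSerde));
    }
}
```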
You can also use the basic SerDe implementation for JSON that ships with the examples:
As a last resort, you can always create your own custom SerDes. For that, you must:

1. Write a serializer for your data type T by implementing org.apache.kafka.common.serialization.Serializer.
2. Write a deserializer for T by implementing org.apache.kafka.common.serialization.Deserializer.
3. Write a serde for T by implementing org.apache.kafka.common.serialization.Serde, which you either do manually (see the existing SerDes in the previous section) or by leveraging helper functions in Serdes such as Serdes.serdeFrom(Serializer<T>, Deserializer<T>). Note that you will need to implement your own class (one that has no generic types) if you want to use your custom serde in the configuration provided to KafkaStreams. If your serde class has generic types, or you use Serdes.serdeFrom(Serializer<T>, Deserializer<T>), you can pass your serde only via method calls (for example builder.stream("topicName", Consumed.with(...))). See the sketch below this list for a concrete example.
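For illustration, here is a minimal sketch of such a custom SerDe for a hypothetical Person POJO, using Jackson for the actual byte conversion. The class, field, and topic names are all made up; on Kafka clients older than 2.1 you would also have to override configure() and close():

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;

public class PersonSerde {

    // Hypothetical data type we want to (de)serialize.
    public static class Person {
        public String name;
        public int age;
    }

    // 1. Serializer for Person, implementing org.apache.kafka.common.serialization.Serializer.
    public static class PersonSerializer implements Serializer<Person> {
        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public byte[] serialize(String topic, Person data) {
            try {
                return data == null ? null : mapper.writeValueAsBytes(data);
            } catch (Exception e) {
                throw new RuntimeException("Failed to serialize Person", e);
            }
        }
    }

    // 2. Deserializer for Person, implementing org.apache.kafka.common.serialization.Deserializer.
    public static class PersonDeserializer implements Deserializer<Person> {
        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public Person deserialize(String topic, byte[] data) {
            try {
                return data == null ? null : mapper.readValue(data, Person.class);
            } catch (Exception e) {
                throw new RuntimeException("Failed to deserialize Person", e);
            }
        }
    }

    // 3. Combine the two into a Serde via the Serdes.serdeFrom helper.
    public static Serde<Person> serde() {
        return Serdes.serdeFrom(new PersonSerializer(), new PersonDeserializer());
    }
}
```

You could then pass it programmatically, e.g. builder.stream("people", Consumed.with(Serdes.String(), PersonSerde.serde())). To use it as a default serde in the Streams configuration instead, wrap it in a small non-generic class, for example by extending Serdes.WrapperSerde<Person>.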
Upvotes: 4