Jameson_uk

Reputation: 487

Can the Kafka Connect JDBC Sink dump raw data?

Partly for testing and debugging, but also to work around an issue we are seeing in a topic where we are unable to change the producer, I would like to be able to store the value as a string in a CLOB in a database table.

I have this working as a Java-based consumer, but I am looking at whether this could be achieved using Kafka Connect.

Everything I have read says you need a schema, the reasoning being that otherwise the connector would not know how to map the data into columns (which makes sense). However, I don't want to do any processing of the data (which could be JSON but might just be plain text); I just want to treat the whole value as a string and load it into one column.

Is there any way this can be done within the Connect config, or am I looking at adding extra processing to update the message (in which case the Java client is probably going to end up being simpler)?
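For reference, this is roughly the config I had in mind (connector name, topic, and connection URL are made up for illustration); as far as I can tell the sink rejects it because the string value carries no field structure:

```properties
# Hypothetical sink config -- names and connection details are placeholders
name=raw-clob-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=my-topic
connection.url=jdbc:oracle:thin:@//db-host:1521/MYDB
# Treat the value as a plain string rather than parsing it into fields
value.converter=org.apache.kafka.connect.storage.StringConverter
```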

Upvotes: 0

Views: 414

Answers (1)

Robin Moffatt

Reputation: 32090

No, the JDBC Sink connector requires a schema to work. You could modify the source code to add in this behaviour.

I would personally try to stick with Kafka Connect for streaming data to a database, since it does all the difficult stuff (scale-out, restarts, etc.) very well. Depending on the processing that you're talking about, a Single Message Transform could well be applicable, since they fit into the Kafka Connect pipeline. For more complex processing, there's Kafka Streams or ksqlDB.
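As one hedged sketch of the SMT route (untested, field name is illustrative): `StringConverter` does attach a plain STRING schema to the value, and the stock `HoistField` transform can wrap that string into a single-field struct, which gives the sink the one column you're after:

```properties
value.converter=org.apache.kafka.connect.storage.StringConverter
# Wrap the raw string value into a struct with one field named "payload"
transforms=wrapValue
transforms.wrapValue.type=org.apache.kafka.connect.transforms.HoistField$Value
transforms.wrapValue.field=payload
```

Whether the resulting column actually ends up as a CLOB depends on the database dialect and auto-creation settings, so this would need verifying against your target database.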

Upvotes: 1
