Reputation: 1247
I am migrating to Kafka as the broker and using Debezium to get data (ETL data) from all of our microservices into reporting and search databases.
Is there any way to configure Debezium so that it puts data on separate topics based on custom criteria (e.g. user, company, or some key column/attribute of the row)?
Upvotes: 3
Views: 2599
Reputation: 51
Hi @Gunnar, for a similar requirement: is it possible to send a message from Debezium to different topics (one or several) based on a condition? E.g. events for table A go to topic A; events for table B go to topics B1 and B2; events for table C go to topics C1, C2, and also B1. In the source connector (or the regex router) there is only the option of setting a single topic name in the class org.apache.kafka.connect.connector.ConnectRecord. Is there a way to set multiple topics, i.e. to send one event to several topics based on some business logic?
Upvotes: 0
Reputation: 19010
I'd suggest implementing a custom SMT (single message transform) which routes the records produced by the Debezium connector to the right topics. You can take Debezium's topic routing SMT, linked in the answer by cricket_007, as an example for your custom implementation. Having the SourceRecord available, you can decide on the destination topic based on any of the captured table's column values.
Kafka Streams or similar would work, too, but I'd recommend looking into SMTs first due to their ease of operation (no separate process needed), and only looking for alternatives if SMTs turn out not to be sufficient.
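As a rough sketch of the routing decision such a custom SMT could make (the column name `company_id` and the topic prefix are illustrative assumptions, not part of any Debezium API; in a real SMT this logic would live in `org.apache.kafka.connect.transforms.Transformation#apply`, reading the column from the record's `after` Struct and emitting the new topic via `record.newRecord(...)`):

```java
import java.util.Map;

// Sketch of per-record topic routing by column value.
// In a real SMT, the row values would come from the SourceRecord's
// "after" Struct rather than a plain Map.
public class TopicRouter {
    private final String routingColumn; // e.g. "company_id" (assumed name)
    private final String topicPrefix;   // e.g. "cdc.company" (assumed name)

    public TopicRouter(String routingColumn, String topicPrefix) {
        this.routingColumn = routingColumn;
        this.topicPrefix = topicPrefix;
    }

    // Decide the destination topic for one captured row.
    public String routeTopic(String originalTopic, Map<String, ?> row) {
        Object key = row.get(routingColumn);
        if (key == null) {
            return originalTopic; // no routing key -> keep the default topic
        }
        return topicPrefix + "." + key;
    }
}
```

The same idea extends to routing by user, table, or any other attribute available on the record.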
Upvotes: 0
Reputation: 191864
I'm not sure if you are looking for Topic Routing.
Assuming you cannot add a filter option to Debezium itself, the typical pattern is to use Kafka Streams, KSQL (or Flink, based on your previous question) to filter and disperse the data you're interested in into the different topics that downstream consumers need.
Within a single Debezium configuration, though, you have to hardcode a namespace/collection/table; you would need multiple configurations to capture multiple of those.
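For example, each connector configuration captures its own fixed set of tables, so routing table A and table B separately means registering two connectors; this is a hypothetical sketch (connection details and the `table.include.list` value are placeholders — older Debezium releases call this property `table.whitelist`):

```json
{
  "name": "orders-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "mysql",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz",
    "database.server.id": "184054",
    "database.server.name": "dbserver1",
    "table.include.list": "inventory.orders"
  }
}
```

A second configuration with, say, `"table.include.list": "inventory.customers"` would capture the other table into its own topic.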
Upvotes: 1