JDev

Reputation: 1832

Is it a good practice to use the existing topic for multiple connectors?

I am using the Debezium PostgreSQL connector to stream the users table into a Kafka topic.

I have a JDBC sink connector that reads the data from that topic and pushes it into its own database.

Now I need a subset of that data for another microservice's database, so I am planning to add a second JDBC sink connector.

The question: is it a good practice to use the existing users table topic? If yes, how can I make sure that the new JDBC sink connector gets a snapshot of the entire users table?

Upvotes: 1

Views: 371

Answers (1)

OneCricketeer

Reputation: 191874

If Debezium snapshotted the table and data hasn't been lost in the topic due to retention, then that's what any sink or other consumer will read.

Each sink connector with a unique name consumes with its own consumer group, so it tracks its own offsets on the topic. Nothing bad happens when multiple consumers read the same topic; that is exactly how Kafka is intended to be used.

You may need to ensure consumer.auto.offset.reset=earliest for Connect to read from the start of the topic; see the sketch below.
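As a minimal sketch, a second sink connector for the same topic might look like the following. The connector name, topic name, and connection URL are placeholders (Debezium's Postgres topics typically follow the server.schema.table pattern, hence dbserver1.public.users here), and the per-connector consumer.override.* prefix only works if the worker permits client overrides:

    # Hypothetical second JDBC sink; name, topic, and URL are placeholders
    name=users-sink-microservice2
    connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
    topics=dbserver1.public.users
    connection.url=jdbc:postgresql://microservice2-db:5432/appdb
    auto.create=true
    insert.mode=upsert
    pk.mode=record_key
    # Per-connector override so this consumer group starts from the
    # beginning of the topic. Requires the worker to allow overrides
    # (connector.client.config.override.policy=All); otherwise set
    # consumer.auto.offset.reset=earliest in the worker properties.
    consumer.override.auto.offset.reset=earliest

Because the new connector name creates a fresh consumer group with no committed offsets, this setting makes it replay the topic from the earliest record, which is how it picks up the original snapshot.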

To keep only a subset of fields, use the ReplaceField transform: https://docs.confluent.io/platform/current/connect/transforms/replacefield.html#replacefield
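For example, adding something like the following to the sink's properties would drop every value field except the ones listed; the field names here are hypothetical, and on older Connect versions the option is named whitelist rather than include:

    # Keep only a subset of the record's value fields (hypothetical list)
    transforms=keepSubset
    transforms.keepSubset.type=org.apache.kafka.connect.transforms.ReplaceField$Value
    transforms.keepSubset.include=id,email,created_at

Applying the transform in the sink connector keeps the topic itself untouched, so the other sink continues to see the full row.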

Upvotes: 1
