Mr.Eddart

Reputation: 10273

Direct Kafka Topic to Database table

Is there a way to automatically tell Kafka to send all events of a specific topic to a specific table of a database?

This would avoid having to create a new consumer that reads from the topic and performs the copy explicitly.

Upvotes: 1

Views: 3763

Answers (1)

Iñigo González

Reputation: 3955

You have two options here:

  • Kafka Connect - this is the standard way to connect Kafka to a database. There are a lot of connectors; to choose one:

    • The best bet is to use the connector specific to your database that is maintained by Confluent.
    • If you don't have a specific one, the second best option is to use the JDBC connector.
  • Direct ingestion from the database, if your database supports it (for instance, ClickHouse and MemSQL are able to load data coming from a Kafka topic). The difference between this and Kafka Connect is that this way it is fully supported and tested by the DB vendor, and you need to maintain fewer pieces of infrastructure.
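For the Kafka Connect route, you configure a sink rather than write code. Below is a minimal sketch of a JDBC sink connector config you would POST to the Connect REST API; the topic name `events`, the PostgreSQL URL, the credentials, and the key field `id` are all placeholders for your own setup:

```json
{
  "name": "events-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "2",
    "topics": "events",
    "connection.url": "jdbc:postgresql://db-host:5432/mydb",
    "connection.user": "kafka_connect",
    "connection.password": "********",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id"
  }
}
```

With `auto.create` enabled the connector creates the target table from the record schema, and `insert.mode=upsert` with `pk.mode=record_key` makes repeated deliveries of the same key idempotent.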

Which one is better? It depends on:

  • your data volume
  • how much you can (and need to) parallelize the load
  • how much downtime or latency you can tolerate.

With direct ingestion, the database usually acts as a single consumer node reading from Kafka. It is good for low-to-mid data volumes. If it fails (or throttles), you might have latency issues.
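As an example of the direct-ingestion pattern, ClickHouse consumes a topic through its Kafka table engine plus a materialized view that copies rows into a regular table. A sketch, assuming a topic `events` carrying JSON rows with hypothetical columns `ts` and `payload`:

```sql
-- Kafka engine table: acts as the consumer. Reads from it are destructive,
-- so it is normally only read through the materialized view below.
CREATE TABLE events_queue (ts DateTime, payload String)
ENGINE = Kafka
SETTINGS kafka_broker_list = 'kafka:9092',
         kafka_topic_list  = 'events',
         kafka_group_name  = 'clickhouse_events',
         kafka_format      = 'JSONEachRow';

-- Destination table where the data actually lives
CREATE TABLE events (ts DateTime, payload String)
ENGINE = MergeTree
ORDER BY ts;

-- Materialized view: continuously moves rows from the queue into the table
CREATE MATERIALIZED VIEW events_mv TO events AS
SELECT ts, payload FROM events_queue;
```

The broker address, group name, and column names are assumptions; the point is that the whole pipeline is declared inside the database, with no separate consumer process to run.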

Kafka Connect allows you to insert data into the database in parallel using several workers. If one of the workers fails, its load is redistributed among the others. If you have a lot of data, this is probably the best way to load it into the database, but you'll need to take care of the Kafka Connect infrastructure yourself unless you're using a managed cloud offering.

Upvotes: 1
