AnswerSeeker

Reputation: 203

In Kafka, can I create a single topic and have multiple producers write to it?

I have the following use case: log files that come from a single data source are pushed to a Kafka topic (say Topic 1). A consumer reads from it, converts the data to JSON format, and writes it to another topic (Topic 2). Another consumer, which expects data in JSON, reads from Topic 2, makes some further modifications, and writes to yet another topic (Topic 3).

My question is: instead of creating 3 different topics, can I create a single topic and have these multiple producers write to it? How will my consumer know which partition to read from, since a group id cannot be set for a producer? One solution I learnt from SO is to create partitions and make each producer write to one particular partition only. The problem with this approach is that the number of producers and consumers might change, and modifying the topic is not desired. Please advise.
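For context, the first stage described above (read raw log lines from Topic 1, convert to JSON, write to Topic 2) might look roughly like this. This is a sketch only: it assumes the `kafka-python` client, a local broker, and a hypothetical log-line format (`timestamp level message`); the topic names follow the question.

```python
import json

def log_line_to_json(line):
    """Convert a raw log line such as "2024-01-01 INFO started" to JSON.
    The three-field format is an assumption for illustration."""
    ts, level, msg = line.split(" ", 2)
    return json.dumps({"timestamp": ts, "level": level, "message": msg})

def run():
    # Kafka wiring kept separate from the pure transform above.
    # Requires: pip install kafka-python, and a broker on localhost:9092.
    from kafka import KafkaConsumer, KafkaProducer
    consumer = KafkaConsumer("topic1", bootstrap_servers="localhost:9092",
                             group_id="json-converter")
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    for record in consumer:
        payload = log_line_to_json(record.value.decode("utf-8"))
        producer.send("topic2", payload.encode("utf-8"))

if __name__ == "__main__":
    run()
```

The second stage (Topic 2 → Topic 3) would be the same shape with a different transform.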

Upvotes: 1

Views: 3542

Answers (1)

Garry

Reputation: 708

As someone already commented, you should not push different types of schemas to a single topic. The number of topics in Kafka is not an issue; you can use a naming convention to manage them, e.g. "topic1", "topic1_json", "topic1_modification".

If your use case would produce an unmanageable number of topics, if the same consumer would read all the JSON topics anyway, and if you don't want same-schema events batched at the destination file system, then you can follow the approach below.

Create an object with a generic schema, or set up a schema registry (check the Confluent Schema Registry), where all your schemas fit as sub-records or each record carries its own schema information. Then create a single topic for all JSON responses (for example, "topic_json_generic"). After reading the data from "topic1", push it to "topic_json_generic"; do the same for the other source topics. At the consumer level you can then handle what needs to be done with each type of object.
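A minimal sketch of the "record carries its own schema information" variant, without a registry. The function names (`wrap`, `dispatch`) and the schema strings are illustrative, not part of any Kafka API; a real setup would likely use Avro and the Confluent Schema Registry instead of plain JSON envelopes.

```python
import json

def wrap(schema_name, payload):
    """Producer side: envelope written to the single generic topic
    (e.g. topic_json_generic), tagging the payload with its schema."""
    return json.dumps({"schema": schema_name, "payload": payload})

def dispatch(raw, handlers):
    """Consumer side: route a message to the handler registered for
    its schema, so one consumer can serve all message types."""
    msg = json.loads(raw)
    handler = handlers.get(msg["schema"])
    if handler is None:
        raise ValueError("unknown schema: " + msg["schema"])
    return handler(msg["payload"])

# Example handlers, one per original source topic's schema.
handlers = {
    "topic1": lambda p: ("from-topic1", p["message"]),
    "topic2": lambda p: ("from-topic2", p),
}
```

For example, `dispatch(wrap("topic1", {"message": "started"}), handlers)` routes the payload to the "topic1" handler; adding a new message type is just another entry in `handlers`, with no topic changes.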

Upvotes: 1
