Farhan Islam

Reputation: 649

Is it an antipattern to have a normalized Kafka Producer Microservice for several topics?

I want to implement a Kafka GraphQL microservice that produces to several topics. It would serve as a single service that other microservices could easily send messages to. Each request to this producer microservice would contain the topic to publish to and the payload associated with that topic. In my mind this is a more streamlined approach than having every service that wants to publish to a topic carry Kafka library code and any other boilerplate associated with producing to a topic. All the Kafka-specific logic would live in the producer service, reducing the amount of duplicated code across services.
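The envelope described above can be sketched minimally; a hedged illustration, where the function name and field names are made up for the example and not part of the question:

```python
import json

def make_publish_request(topic: str, payload: dict) -> str:
    """Serialize a {topic, payload} envelope as the JSON body of one
    request to the hypothetical producer microservice."""
    return json.dumps({"topic": topic, "payload": payload})

# Example: another service asks the producer service to publish to "orders".
body = make_publish_request("orders", {"orderId": 42, "status": "CREATED"})
```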

Upvotes: 2

Views: 241

Answers (1)

OneCricketeer

Reputation: 191983

Not an anti-pattern at all. The Producer is also thread-safe, so a single instance can serve many concurrent requests.

Confluent's Kafka REST Proxy already does this: you POST an array of partition+record objects to a per-topic endpoint

POST /clusters/{cluster_id}/topics/{topic_name}/records

Docs - https://docs.confluent.io/platform/current/kafka-rest/api.html#records-v3
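For illustration, here is how the body for that v3 records endpoint could be built. The field shapes (`"type"`/`"data"` wrappers, `JSON` and `STRING` types) follow the linked docs but should be verified against your proxy version; actually sending the request also needs a running REST Proxy, so only the body is constructed here:

```python
import json

def build_record(value: dict, key=None) -> str:
    """Build one record body for the REST Proxy v3 produce endpoint.
    The {"type": ..., "data": ...} wrapper shape is taken from the
    Confluent docs linked above; verify against your proxy version."""
    record = {"value": {"type": "JSON", "data": value}}
    if key is not None:
        record["key"] = {"type": "STRING", "data": key}
    return json.dumps(record)

keyed = build_record({"orderId": 42}, key="order-42")
unkeyed = build_record({"orderId": 43})
```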


However, depending on your use-case, I'd suggest implementing GraphQL resolvers with topics already hard-coded for the specific events you want to generate rather than user-supplied. In other words, prevent bad actors from populating any random topic with random data.
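The hard-coding idea can be sketched as a fixed event-type-to-topic map, where anything outside the map is rejected so callers can never name an arbitrary topic; the event and topic names here are made up:

```python
# Fixed mapping: each event type a resolver handles goes to exactly one topic.
TOPIC_FOR_EVENT = {
    "OrderCreated": "orders",
    "PaymentReceived": "payments",
}

def resolve_topic(event_type: str) -> str:
    """Return the hard-coded topic for a known event type; reject anything
    else so a caller cannot route data into a random topic."""
    try:
        return TOPIC_FOR_EVENT[event_type]
    except KeyError:
        raise ValueError(f"unknown event type: {event_type}")
```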


In my mind this is a more streamlined approach than having every service that wants to publish to a topic carry Kafka library code and any other boilerplate associated with producing to a topic.

That "boilerplate" could live in an external library of its own, such as autowired Spring-Kafka beans.
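The same shared-library idea, sketched here in Python rather than Spring beans: a thin wrapper each service imports, with the Kafka client (e.g. confluent-kafka's `Producer`, which exposes `produce(topic, value=...)`) injected behind one interface. The class names are illustrative, and a fake producer stands in for the real client since producing needs a broker:

```python
import json

class EventPublisher:
    """Thin wrapper a shared library could export so services never touch
    the Kafka client directly; the producer is injected for testability."""
    def __init__(self, producer):
        self._producer = producer  # anything with .produce(topic, value=...)

    def publish(self, topic: str, event: dict) -> None:
        self._producer.produce(topic, value=json.dumps(event).encode("utf-8"))

class FakeProducer:
    """Stands in for the real client; records what would have been sent."""
    def __init__(self):
        self.sent = []
    def produce(self, topic, value):
        self.sent.append((topic, value))

fake = FakeProducer()
EventPublisher(fake).publish("orders", {"orderId": 42})
```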

Upvotes: 2
