MaxG

Reputation: 1077

Kafka Consumer per business use case - best way to implement

I'm using a single Kafka topic for different types of related messages. The topic name is apiEvents. The events are of type:

  1. ApiUpdateEvent
  2. EndpointUpdateEvent
  3. TemplateUpdateEvent

One of my applications consumes all of these events. Moreover, I want it to consume the same event in two different ways, for two unrelated use cases.

For example, two use cases for the same event (EndpointUpdateEvent):

  1. I'd like to create a 500ms time window and respond ONCE to an aggregation of all the events that arrived in that window.
  2. I want to respond to each of those same events individually, by triggering some DB operation.
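The two use cases above can be pictured as two independent handlers for the same event stream. This is a minimal plain-Java sketch, not actual Kafka code; all class and method names are hypothetical, and the 500ms window close is modeled as an explicit callback rather than a real timer:

```java
import java.util.ArrayList;
import java.util.List;

// Use case 1: buffer events and react once per window to the whole batch.
class WindowedHandler {
    final List<String> window = new ArrayList<>();
    List<String> batch;                       // set when the window closes

    void onEvent(String event) { window.add(event); }

    void onWindowClose() {                    // e.g. invoked by a 500ms timer
        batch = new ArrayList<>(window);      // respond ONCE to the aggregate
        window.clear();
    }
}

// Use case 2: react to every single event, e.g. with a DB operation.
class PerEventHandler {
    final List<String> dbOperations = new ArrayList<>();

    void onEvent(String event) {
        dbOperations.add("UPDATE for " + event);  // stand-in for a real DB call
    }
}
```

Both handlers receive every event, but each encapsulates its own reaction, which is the separation the question is after.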

I want to write code that is clean and maintainable, and I don't want to throw all the use cases into one big, messy consumer.

A solution I've thought of is to write a new Kafka consumer for each use case and assign each consumer a different groupId (within the same application). That way, each business use case gets its own class, which handles the events in its own way. Seems tidy enough.
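That layout can be sketched as one consumer configuration per use case, identical except for group.id, so each group maintains its own offsets on the same topic. This is an illustrative sketch using only java.util.Properties; the broker address and group names are assumptions, not values from the question:

```java
import java.util.Properties;

class ConsumerConfigs {
    // One configuration per business use case; only group.id differs,
    // so each consumer group tracks its own progress on apiEvents.
    static Properties forGroup(String groupId) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", groupId);                   // one group per use case
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }
}
```

Each configuration would back its own KafkaConsumer instance (e.g. one for an "endpoint-window" group and one for an "endpoint-db" group), keeping each use case's class self-contained.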

  1. Could any problems arise if I create too many consumer groups in one application?
  2. Is there a better solution that will let me keep the code clean and separate the different business logic use cases?

Upvotes: 0

Views: 672

Answers (2)

Oswin Noetzelmann

Reputation: 9555

It sounds like you are on the right track: separate consumer groups for different business logic use cases have separately managed offsets, so each group tracks its own progress independently. This also aligns with a microservice-style architecture, where different business cases may be implemented in different components.

One more consideration (I cannot judge this from the information provided alone): I would also think about splitting your topic into one topic per event type. It is not a problem for a consumer group to subscribe to multiple topics at the same time, whereas it is less efficient to have consumers process and discard a large number of events that are irrelevant to them.

Upvotes: 1

OneCricketeer

Reputation: 191738

You can use the Kafka Streams Processor API to consume and act on individual messages, as well as window them within a specific tumbling or hopping time period.
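The Processor API pattern can be pictured in plain Java as a per-record hook plus a scheduled punctuation that drains a state store. This sketch uses illustrative names, not the actual Kafka Streams interfaces, and the punctuation is triggered manually instead of by a real scheduler:

```java
import java.util.ArrayList;
import java.util.List;

class WindowingProcessor {
    final List<String> store = new ArrayList<>();        // stands in for a state store
    final List<String> individualActions = new ArrayList<>();
    final List<List<String>> windowActions = new ArrayList<>();

    // Per-record path: fires for every message (use case 2).
    void process(String record) {
        individualActions.add(record);
        store.add(record);
    }

    // Scheduled path: fires on a schedule, e.g. every 500 ms (use case 1),
    // and reacts once to everything accumulated since the last punctuation.
    void punctuate() {
        windowActions.add(new ArrayList<>(store));
        store.clear();
    }
}
```

In real Kafka Streams, process() and the punctuator are registered on a Processor with a scheduled punctuation interval, so both behaviors live in one topology instead of two consumer groups.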

Upvotes: 0
