mrmannione

Reputation: 819

Kafka Consumer API vs Streams API for event filtering

Should I use the Kafka Consumer API or the Kafka Streams API for this use case? I have a topic with a number of consumer groups consuming from it. The topic contains one type of event: a JSON message with a type field buried inside it. Some messages will be consumed by some consumer groups and not by others; one consumer group will probably consume hardly any messages at all.

My question is: should I use the Consumer API, read the type field on each event, and drop or process the event based on that field?
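
To make the first option concrete, here is a minimal sketch of the Consumer API approach. The topic name "events", the group id "my-service", and the type values are hypothetical placeholders for my actual setup:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.Set;

public class FilteringConsumer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "my-service");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        ObjectMapper mapper = new ObjectMapper();
        Set<String> wantedTypes = Set.of("DB_DELETE", "DB_UPDATE"); // hypothetical types

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    JsonNode event = mapper.readTree(record.value());
                    // Drop events whose type this service does not handle.
                    if (!wantedTypes.contains(event.path("type").asText())) {
                        continue;
                    }
                    process(event);
                }
            }
        }
    }

    private static void process(JsonNode event) { /* service-specific work */ }
}
```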

Or should I filter with the Streams API, using the filter method with a predicate?
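
And a sketch of the same filtering done with the Streams API, using the same hypothetical topic and type names:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class FilteringStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-service");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        ObjectMapper mapper = new ObjectMapper();
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("events");
        events
            // Keep only the types this service handles; drop unparseable messages.
            .filter((key, value) -> {
                try {
                    String type = mapper.readTree(value).path("type").asText();
                    return type.equals("DB_DELETE") || type.equals("DB_UPDATE");
                } catch (Exception e) {
                    return false;
                }
            })
            .foreach((key, value) -> process(value));

        new KafkaStreams(builder.build(), props).start();
    }

    private static void process(String event) { /* service-specific work */ }
}
```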

After I consume an event, the plan is to process it (a DB delete, update, or other action depending on the service); if processing fails, I will produce the event to a separate queue and re-process it later.
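
The failure path would look roughly like this; the retry topic name "events-retry" is hypothetical:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class RetryingHandler {
    private final KafkaProducer<String, String> producer;

    public RetryingHandler() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        this.producer = new KafkaProducer<>(props);
    }

    public void handle(String key, String value) {
        try {
            process(value); // DB delete, update, or other per-service work
        } catch (Exception e) {
            // On failure, re-queue the original payload for later re-processing.
            producer.send(new ProducerRecord<>("events-retry", key, value));
        }
    }

    private void process(String value) { /* service-specific logic */ }
}
```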

Thank you.

Upvotes: 1

Views: 628

Answers (1)

Daniel Hinojosa

Reputation: 992

This seems more a matter of opinion. I would personally go with Streams/KSQL: it is likely less code for you to maintain. You can write the cleaned-up data to an intermediary topic and then attach a Connect sink, other consumers, or other Streams and KSQL processes to it. With Streams you can scale a single application across different machines, store state, have standby replicas, and more, all of which would be a PITA to build yourself.
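
A rough sketch of that shape, with hypothetical topic names "events" (raw) and "events-clean" (intermediary): one Streams app filters once, and everything downstream reads the clean topic.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

import java.util.Properties;

public class CleanupTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-cleanup");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("events")
               // Crude string match for brevity; real code would parse the JSON.
               .filter((key, value) -> value.contains("\"type\":\"DB_UPDATE\""))
               .to("events-clean"); // intermediary topic for sinks and other consumers

        new KafkaStreams(builder.build(), props).start();
    }
}
```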

Upvotes: 2
