manchitro

Reputation: 131

How to put events in a Kafka Stream after they have been consumed?

I have a Spring Boot application which consumes from a few Kafka topics for further processing and eventually pushes the results into a DB. It also has Kafka Streams topologies which filter specific types of events from each topic and count how many events of each specific type have been produced.

I also need to know how many of these events (of each specific type) have been consumed by my application. The Streams work perfectly for counting how many events are in a topic (i.e., have been produced), but I need the same functionality for consumed messages too. How would I go about implementing something like that?

Upvotes: 0

Views: 880

Answers (1)

groo

Reputation: 4438

As you're using Spring Boot, I would recommend enabling Micrometer, which automatically collects the Kafka client metrics and lets you easily integrate with your preferred monitoring infrastructure, as it supports most of the commonly used ones.

The consumed-record count is a metric that is already collected, so you can easily expose it, create a visualization chart from it, calculate rates, and get insights per date range.
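As a minimal sketch (assuming `spring-boot-starter-actuator` and Micrometer are on the classpath, which is an assumption about your build), exposing the metrics endpoint in `application.yml` makes the Kafka client counters queryable over HTTP:

```yaml
# application.yml — assumes spring-boot-starter-actuator is a dependency
management:
  endpoints:
    web:
      exposure:
        include: metrics   # expose /actuator/metrics over HTTP
```

With that in place, `GET /actuator/metrics/kafka.consumer.fetch.manager.records.consumed.total` (the name follows Micrometer's Kafka client metric binding and may differ across client/Micrometer versions) returns the total records consumed, and you can drill down per topic via the metric's tags.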

Spring Kafka monitoring

Kafka client metrics

Upvotes: 1
