Luckylukee

Reputation: 595

Consuming Kafka messages by two separate applications (storm and spark streaming)

We have developed an ingestion application using Storm that consumes Kafka messages (some time-series sensor data) and saves them into Cassandra. We use a NiFi workflow for this.

I am now going to develop a separate Spark Streaming application which needs to consume those Kafka messages as a source. I wonder if there would be a problem with two applications interacting with the same Kafka channel. Should I duplicate the Kafka messages in NiFi to another channel so my Spark Streaming application can use them? That would be an overhead, though.

Upvotes: 0

Views: 50

Answers (1)

streetturtle

Reputation: 5850

From the Kafka documentation:

If all the consumer instances have different consumer groups, then each record will be broadcast to all the consumer processes.

Which in your case means that your second application just has to use a different consumer group; both applications will then receive the same messages.
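As a minimal sketch of what that looks like on the Spark Streaming side (using the Kafka 0.10 direct stream integration; the broker address, topic name `sensor-data`, and group id `spark-streaming-group` are placeholders you would replace with your own values):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.KafkaUtils

object SensorStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("sensor-spark-streaming")
    val ssc = new StreamingContext(conf, Seconds(10))

    // The key point: give this application its own group.id, distinct
    // from the one the Storm application uses. Each group then receives
    // every record published to the topic.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",            // placeholder broker address
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "spark-streaming-group",               // different from the Storm app's group
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val topics = Array("sensor-data")                      // placeholder topic name
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))

    // Each consumer group tracks its own offsets, so reading here
    // does not interfere with the existing Storm/NiFi pipeline.
    stream.map(record => (record.key, record.value)).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

No duplication of messages in NiFi is needed; both applications read the same topic independently.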

Upvotes: 1
