vaibhav

Reputation: 4103

Kafka Streams Custom processing

I have a requirement to process huge files, and there could be multiple files that we end up processing in parallel.

One option I have thought of is that each message pushed to the broker will carry: the row data + the rule to be applied + a correlation ID (which would act as an identifier for that particular file).

I plan to use Kafka Streams and create a topology with a processor that reads the rule from the message, applies it, and sinks the result.
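The per-message idea above (row + rule + correlation ID in one record, processor applies the named rule) could be sketched without any Kafka dependencies, just to show the shape of the processing step. All names here (`RowMessage`, `RuleProcessor`, the rule strings) are hypothetical illustrations, not Kafka Streams API:

```java
import java.util.Locale;

public class RuleProcessor {

    // Each Kafka record would carry the row data, the rule name,
    // and a correlation ID identifying the source file.
    public static final class RowMessage {
        final String correlationId; // identifier for the file
        final String rule;          // rule to apply to this row
        final String row;           // raw row data

        public RowMessage(String correlationId, String rule, String row) {
            this.correlationId = correlationId;
            this.rule = rule;
            this.row = row;
        }
    }

    // Core of the processor: apply whatever rule the message itself names.
    // In a real topology this would run inside Processor#process() before
    // forwarding the result to the sink topic.
    public static String process(RowMessage msg) {
        switch (msg.rule) {
            case "UPPERCASE": return msg.row.toUpperCase(Locale.ROOT);
            case "TRIM":      return msg.row.trim();
            default:          return msg.row; // unknown rule: pass through
        }
    }
}
```

Because the rule travels with each record, rows from different files (different correlation IDs, different rules) can be interleaved on the same topic and still be processed independently.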

However (I am new to Kafka Streams, hence I may be wrong):

Upvotes: 0

Views: 214

Answers (1)

Kiran Balakrishnan

Reputation: 257

I guess you can set aside a key/value record that is sent to the topic at the end of each file, signifying the closure of that file. Say the record has a unique key such as -1, which signifies EOF.
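The sentinel idea could be sketched like this on the consuming side: the producer sends one final record with key "-1" for a given correlation ID, and the consumer marks that file closed when it sees it. The class and method names are hypothetical, not a Kafka API:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class FileCompletionTracker {
    private final Map<String, Integer> rowsSeen = new HashMap<>();
    private final Set<String> closedFiles = new HashSet<>();

    // Called once per consumed record; key "-1" is the EOF sentinel.
    public void onRecord(String key, String correlationId) {
        if ("-1".equals(key)) {
            closedFiles.add(correlationId);
        } else {
            rowsSeen.merge(correlationId, 1, Integer::sum);
        }
    }

    public boolean isComplete(String correlationId) {
        return closedFiles.contains(correlationId);
    }

    public int rowCount(String correlationId) {
        return rowsSeen.getOrDefault(correlationId, 0);
    }
}
```

One caveat worth noting: if the topic has multiple partitions, a single sentinel record lands in only one partition, so rows of the same file should be keyed by their correlation ID (or the sentinel sent once per partition) for this to be reliable.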

Upvotes: 1
