Abhinav Atul

Reputation: 637

Kafka batch processing, ack granularity and dead letter queue

Does Kafka support (or plan to support) ack granularity in batch processing when a dead letter queue is configured?

  1. Say the batch size is 100
  2. The consumer reads 100 log records in a batch
  3. It processes all of them successfully except the records at batch positions 20, 51 and 99

Does Kafka allow sending a compressed ack of the form (19, -1, 30, -1, 48, -1, 1), so that the messages at batch indices 20, 51 and 99 are published to the dead letter queue?
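To make the format concrete, here is a sketch of the run-length encoding the question proposes: a positive n means n consecutive successes, -1 marks one failed record. (Note that with 1-based indices and failures at 20, 51 and 99, the third run works out to 47 rather than 48.) These helpers are illustrative only, not a Kafka API:

```python
def encode_ack(batch_size, failed):
    """Run-length encode a batch ack: positive n = n consecutive
    successes, -1 = one failed record (1-based indices assumed)."""
    runs, prev = [], 0
    for idx in sorted(failed):
        ok = idx - prev - 1          # successes since the last failure
        if ok:
            runs.append(ok)
        runs.append(-1)
        prev = idx
    if batch_size - prev:            # trailing successes, if any
        runs.append(batch_size - prev)
    return runs

def decode_failed(runs):
    """Recover the 1-based indices of failed records from the ack,
    i.e. the records that would go to the dead letter queue."""
    failed, pos = [], 0
    for r in runs:
        if r == -1:
            pos += 1
            failed.append(pos)
        else:
            pos += r
    return failed
```

For a batch of 100 with failures at 20, 51 and 99, `encode_ack` yields `[19, -1, 30, -1, 47, -1, 1]`, and `decode_failed` recovers the three indices to re-publish.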

Upvotes: 4

Views: 795

Answers (1)

No. You have to send the complete batch again.

If you want an ack per message, you have to send them one by one, which might be inefficient.

You also have to make sure that either auto-commit happens only after processing, or you configure the Kafka consumer so that you commit offsets manually.
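The commit-after-processing idea can be sketched independently of the client library. Here `records`, `process` and `commit` are hypothetical stand-ins; with a real consumer you would disable auto-commit (e.g. `enable_auto_commit=False` in kafka-python) and call the client's commit method yourself:

```python
def process_then_commit(records, process, commit):
    """Process records one by one; commit an offset only after the
    record was processed successfully.

    records: iterable of (offset, value) pairs
    process: callable that raises on failure
    commit:  callable taking the next offset to consume
    """
    for offset, value in records:
        process(value)       # may raise; nothing is committed on failure
        commit(offset + 1)   # Kafka convention: commit the next offset to read
```

If `process` raises mid-batch, only the offsets before the failing record have been committed, so a restarted consumer resumes at the failed record instead of silently skipping it.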

Depending on how often processing fails in your system, I would suggest:

  1. For very rare processing errors that are unrelated to the message itself (e.g. a network problem), I would send batches.
  2. For frequent logical errors in the messages, I would write a small validator that re-publishes broken messages to the dead letter queue.
  3. If you can somehow patch broken messages, I would wrap the broken message in a proper default message envelope and send it on for further processing, with a status indicating that the original message was broken so that downstream processing can handle it.
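Options 2 and 3 can be sketched as follows; `is_valid`, `publish` and `dlq_publish` are hypothetical stand-ins for your validation logic and producer calls, not Kafka APIs:

```python
def validate_and_route(messages, is_valid, publish, dlq_publish):
    """Option 2: a small validator that re-publishes broken messages
    to the dead letter topic instead of failing the whole batch."""
    for msg in messages:
        if is_valid(msg):
            publish(msg)       # normal processing path
        else:
            dlq_publish(msg)   # dead letter topic

def envelope(msg, broken):
    """Option 3: wrap a (possibly broken) message in a default
    envelope carrying a status flag for downstream handlers."""
    return {"status": "broken" if broken else "ok", "payload": msg}
```

With real producers, `publish` and `dlq_publish` would simply send to the main topic and the dead letter topic respectively; the routing logic itself stays this simple.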

Upvotes: 1
