Ali

Reputation: 1879

Replay messages from dead letter queue in Spring Cloud Stream with Kafka binder

We are using Spring Cloud Stream with the Confluent Schema Registry, Avro, and the Kafka binder. We have configured all of the services in our data processing pipeline to use a shared DLQ Kafka topic, to simplify exception handling and to be able to replay failed messages. However, it seems we cannot properly extract the message payloads, because messages with different schemas are published to a single DLQ topic. As a result, we lose track of the schema of the original message.

Is there any way to preserve the original schema id of the failed messages in the DLQ so that it can be used for seamless replay?
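For context, a minimal sketch of the shared-DLQ setup using the Kafka binder's `enableDlq`/`dlqName` consumer properties (the binding name `process-in-0` and the topic name `shared-dlq` are hypothetical placeholders for our actual configuration):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          process-in-0:              # hypothetical consumer binding name
            consumer:
              enable-dlq: true
              dlq-name: shared-dlq   # every service points at this one DLQ topic
```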

Upvotes: 1

Views: 803

Answers (1)

Ali

Reputation: 1879

It turns out this can be achieved by changing the subject naming strategy to RecordNameStrategy. With that strategy, the schema subject is derived from the record's fully qualified name rather than the topic name, so a record keeps the same schema subject across all topics, including the shared DLQ. More details can be found here.
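As a sketch, the strategy can be set through the binder's client configuration so it applies to the Confluent Avro serializer and deserializer (the registry URL is a placeholder; adjust to your environment):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          configuration:
            schema.registry.url: http://localhost:8081
            # Subject = fully qualified record name, independent of the topic,
            # so DLQ'd records resolve to the same schema as the originals.
            value.subject.name.strategy: io.confluent.kafka.serializers.subject.RecordNameStrategy
```

Note that RecordNameStrategy registers one subject per record type across the whole cluster, so compatibility checks apply to all topics carrying that record type.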

Upvotes: 1
