Reputation: 457
I have been trying to configure one Kafka broker, one topic, one producer, and one consumer. While the producer is producing, if the broker goes down, data is lost, e.g.:
In Buffer:
Datum 1 - published
Datum 2 - published
.
. ---->(Broker goes down for a while and reconnects...)
.
Datum 4 - published
Datum 5 - published
The properties configured for the producer are:
bootstrap.servers=localhost:9092
acks=all
retries=1
batch.size=16384
linger.ms=2
buffer.memory=33554432
key.serializer=org.apache.kafka.common.serialization.IntegerSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
producer.type=sync
buffer.size=102400
reconnect.interval=30000
request.required.acks=1
The data size is smaller than the configured buffer size. Help me understand where I am going wrong!
Upvotes: 0
Views: 2559
Reputation: 62285
Not sure what exactly you do. I would assume that the messages you try to write to Kafka while the broker is down are not acked by Kafka. If a message is not acked, that indicates the message was not written to Kafka, and the producer needs to retry writing it.
The easiest way to do this is by setting the configuration parameters retries and retry.backoff.ms accordingly.
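For example, a producer configuration along these lines would make the client keep retrying failed sends for a while before giving up (the values here are illustrative, not recommendations; delivery.timeout.ms bounds the total time in newer client versions):

```
acks=all
retries=2147483647
retry.backoff.ms=500
delivery.timeout.ms=120000
```

Note that the question's config sets retries=1, which allows only a single retry; also, acks=all and the legacy request.required.acks=1 contradict each other.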
At the application level, you can also register a Callback in send(..., Callback) to get informed about success/failure. In case of failure, you could retry sending by calling send() again.
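A minimal sketch of that callback approach, assuming the standard Java client (org.apache.kafka:kafka-clients) and the Integer/String serializers from the question; the topic name "my-topic" and the retry count are made up for illustration:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.IntegerSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class RetryingProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("key.serializer", IntegerSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        KafkaProducer<Integer, String> producer = new KafkaProducer<>(props);
        sendWithRetry(producer, new ProducerRecord<>("my-topic", 1, "Datum 1"), 3);
        producer.close();
    }

    // Re-submits the record from the callback if the send failed,
    // up to retriesLeft additional attempts.
    static void sendWithRetry(KafkaProducer<Integer, String> producer,
                              ProducerRecord<Integer, String> record,
                              int retriesLeft) {
        producer.send(record, (metadata, exception) -> {
            if (exception != null) {
                if (retriesLeft > 0) {
                    sendWithRetry(producer, record, retriesLeft - 1);
                } else {
                    // Out of retries: log, dead-letter, or fail the application.
                    System.err.println("Giving up on " + record + ": " + exception);
                }
            }
        });
    }
}
```

Callback is a functional interface (onCompletion(RecordMetadata, Exception)), so a lambda works on Java 8+. Be aware that application-level retries like this can reorder messages, just like the built-in retries can unless max.in.flight.requests.per.connection is limited.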
Upvotes: 2