annedroiid

Reputation: 6667

Apache KafkaProducer throwing TimeoutException when sending a message

I have a KafkaProducer that has suddenly started throwing TimeoutExceptions when I try to send a message. Even though I have set the max.block.ms property to 60000 ms, and the test does block for 60 seconds, the error message I get always reports a time of less than 200 ms. The only time it actually shows 60000 ms is if I run it in debug mode and step through the waitOnMetadata method manually.

error example:
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 101 ms.

Does anyone know why it would suddenly be unable to update the metadata? I know it's not my implementation of the producer that is faulty: not only have I not changed it since it was working, but my tests all pass if I run them on another server. What server-side reasons could there be for this? Should I restart my brokers? And why would the timeout message show an incorrect time if I just let it run?

Producer setup:

import java.util.Properties
import org.apache.kafka.clients.producer.KafkaProducer

val props = new Properties()
// getBootstrapServersFor resolves the broker list for the given datacenter
props.put("bootstrap.servers", getBootstrapServersFor(datacenter.mesosLocal))
props.put("batch.size", "0")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("max.block.ms", "60000")
new KafkaProducer[String, String](props)
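
For completeness, here is a rough sketch of how a message gets sent in the test (the topic name and key/value below are placeholders, not my real data):

import java.util.concurrent.TimeUnit
import org.apache.kafka.clients.producer.ProducerRecord

val producer = new KafkaProducer[String, String](props)
// send() is asynchronous; blocking on the returned Future makes send failures visible.
// Depending on the client version, the metadata timeout either comes back through this
// Future (often wrapped in an ExecutionException) or is thrown from send() itself.
producer.send(new ProducerRecord[String, String]("my-topic", "key", "value"))
  .get(60, TimeUnit.SECONDS)
producer.close()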

Upvotes: 0

Views: 660

Answers (1)

annedroiid

Reputation: 6667

I tried to use the console producer to see if I could send messages, and I got a lot of WARN Error while fetching metadata with correlation id 0 : {metadata-1=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient) messages back. After stopping and restarting the broker I was able to send and consume messages again.
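
In case it helps anyone else, one rough way to check from code whether the partitions actually have a leader is the AdminClient (available in kafka-clients 0.11+). The broker address below is a placeholder, and metadata-1 is the topic name from the warning above:

import java.util.{Collections, Properties}
import org.apache.kafka.clients.admin.{AdminClient, AdminClientConfig}
import scala.collection.JavaConverters._

val adminProps = new Properties()
adminProps.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // placeholder broker

val admin = AdminClient.create(adminProps)
try {
  val descriptions = admin
    .describeTopics(Collections.singleton("metadata-1"))
    .all()
    .get()
  descriptions.get("metadata-1").partitions().asScala.foreach { p =>
    // A missing leader here lines up with the LEADER_NOT_AVAILABLE warnings
    println(s"partition ${p.partition()} leader: ${p.leader()}")
  }
} finally {
  admin.close()
}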

Upvotes: 1
