mirzaD14

Reputation: 43

Is it better to keep a Kafka Producer open or to create a new one for each message?

I have data coming in through RabbitMQ. The data is coming in constantly, multiple messages per second. I need to forward that data to Kafka.

In my RabbitMQ delivery callback, where I receive the data, I have a Kafka producer that immediately sends the received messages to Kafka. My question is very simple: is it better to create a Kafka producer outside the callback method and use that one producer for all messages, or should I create the producer inside the callback method and close it after the message is sent, which means creating a new producer for each message?

It might be a naive question but I am new to Kafka and so far I did not find a definitive answer on the internet.

EDIT: I am using the Java Kafka client.

Upvotes: 4

Views: 3775

Answers (2)

louxiu

Reputation: 2915

The Kafka producer is stateful: it holds cluster metadata (periodically synced from the brokers), an internal send buffer, and so on. Creating a new producer for each message is therefore impractical.

Upvotes: 1

ndogac

Reputation: 1245

Creating a Kafka producer is an expensive operation, so reusing a single producer instance (e.g. as a singleton) is good practice for both performance and resource usage.

For Java clients, this is from the docs:

The producer is thread safe and should generally be shared among all threads for best performance.
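To make the recommended pattern concrete, here is a minimal Java sketch of a single long-lived `KafkaProducer` shared by a RabbitMQ `DeliverCallback`. The broker address, topic name, and queue name are placeholder assumptions; the sketch requires the `kafka-clients` and `amqp-client` dependencies and a running broker, so it is illustrative rather than a drop-in implementation.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.DeliverCallback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class RabbitToKafkaBridge {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        // Create the producer ONCE, outside the callback. It is thread safe
        // and is reused for every message delivered by RabbitMQ.
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        Runtime.getRuntime().addShutdownHook(new Thread(producer::close));

        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost"); // placeholder RabbitMQ host
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();

        // The callback only builds a record and hands it to the shared
        // producer; no producer is created or closed per message.
        DeliverCallback callback = (consumerTag, delivery) -> {
            String body = new String(delivery.getBody(), StandardCharsets.UTF_8);
            producer.send(new ProducerRecord<>("my-topic", body)); // placeholder topic
        };

        channel.basicConsume("my-queue", true, callback, consumerTag -> { });
    }
}
```

Closing the producer in a shutdown hook (or a `finally` block) flushes any buffered records before the process exits, which a per-message create/close pattern would otherwise force on every single send.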

For librdkafka-based clients (confluent-dotnet, confluent-python, etc.), there is a related GitHub issue with this quote:

Yes, creating a singleton service like that is a good pattern. you definitely should not create a producer each time you want to produce a message - it is approximately 500,000 times less efficient.

Upvotes: 5
