Haseeb U

Reputation: 1

How to handle large messages in Kafka via producer-side config?

Some messages from my Kafka producer are larger than the default 1MB and are rejected by the broker.

I don't have access to the broker or the consumer to make changes, so I can only make changes at the producer level.

I am considering using Snappy compression with batch.size set to 5MB in my producer. Please let me know if this is the right option, or whether another approach would work for my use case. Thanks.
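For reference, the configuration I am considering would look roughly like this (a minimal sketch; the bootstrap address and the string serializers below are placeholders, and 5MB is written out in bytes):

```java
import java.util.Properties;

class ProposedProducerConfig {
    // Sketch of the producer settings described above: Snappy compression
    // with a 5MB batch size. Bootstrap address and serializers are assumed.
    static Properties proposedProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed address
        props.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty("compression.type", "snappy");  // compress batches before sending
        props.setProperty("batch.size", "5242880");       // 5MB = 5 * 1024 * 1024 bytes
        return props;
    }

    public static void main(String[] args) {
        Properties props = proposedProps();
        System.out.println("compression.type=" + props.getProperty("compression.type"));
        System.out.println("batch.size=" + props.getProperty("batch.size"));
    }
}
```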

Upvotes: 0

Views: 788

Answers (1)

ankita.gulati

Reputation: 929

Whenever a message is sent from the producer, it is stored at the Kafka broker and consumed by the consumer: Producer -> Broker -> Consumer

So, if you want to send large messages through Kafka, you normally need to change configuration settings on both the broker and the consumer.

a) For the broker:

message.max.bytes=15728640 
replica.fetch.max.bytes=15728640

b) For the consumer:

fetch.message.max.bytes=15728640

But since you don't have access to the broker or the consumer, you can instead set the compression property at the producer or topic level.

Set the compression property at topic level:

./bin/kafka-topics --create --zookeeper localhost:2181 --partitions 1 --replication-factor 1 --config compression.type=gzip --topic topic_name

or set the property compression.type=gzip in the Kafka producer client configuration.

Upvotes: 2

Related Questions