Reputation: 23
When I try to publish a message through a Kafka producer in my Spring Boot application, I get a RecordTooLargeException.
The error is:
org.apache.kafka.common.errors.RecordTooLargeException: The message is 1235934 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
I have read other discussions about this problem, but none of them helped, because I need to both publish the message and consume it on the client side.
Please help me with brief configuration steps to do this.
Upvotes: 2
Views: 2332
Reputation: 3814
A nice thing about Kafka is that it has great exception messages that are pretty much self-explanatory. This one is basically saying that your message is too large (which you have already concluded yourself, I believe).
If you check the docs for producer config and look up max.request.size in the table, the explanation says:
The maximum size of a request in bytes. This setting will limit the number of record batches the producer will send in a single request to avoid sending huge requests. This is also effectively a cap on the maximum record batch size. Note that the server has its own cap on record batch size which may be different from this.
You can configure this value in your producer configuration, like so:
properties.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "value-in-bytes");
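For context, here is a minimal, self-contained sketch of a plain producer with that setting applied; the broker address, topic name, and the 2 MB limit are placeholder assumptions for illustration, not values you should copy as-is:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LargeMessageProducer {

    public static void main(String[] args) {
        Properties properties = new Properties();
        // Placeholder broker address - replace with your own
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Raise the request size cap to 2 MB (example value, in bytes)
        properties.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 2097152);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(properties)) {
            // "large-messages" is a made-up topic name for this example
            producer.send(new ProducerRecord<>("large-messages", "key", "a payload larger than 1 MB..."));
        }
    }
}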
However, the default is good enough for about 90% of use cases. If you can, avoid sending such large messages altogether, or try compressing them (this works wonders for throughput), like so:
properties.setProperty(ProducerConfig.COMPRESSION_TYPE_CONFIG, "snappy");
There are two other compression types, but this one is from Google and is pretty efficient. Along with compression, you can tweak two other values to get much better performance (batch.size and linger.ms), but you would have to test them for your use case; see the sketch below.
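For illustration only, a rough starting point might look like the lines below; the 64 KB batch size and 20 ms linger are placeholder values you would need to benchmark yourself:

properties.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "snappy");
// Collect up to 64 KB of records per partition batch before sending (example value, in bytes)
properties.put(ProducerConfig.BATCH_SIZE_CONFIG, 65536);
// Wait up to 20 ms for a batch to fill before sending it (example value)
properties.put(ProducerConfig.LINGER_MS_CONFIG, 20);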
Upvotes: 4