Reputation: 209
I am running into the exception below when publishing an event to a Kafka topic:
org.apache.kafka.common.errors.RecordTooLargeException: The message is 2063239 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
I looked at https://github.com/spring-cloud/spring-cloud-stream/issues/587
I am using the Spring Cloud versions below and did not specify a version for spring-cloud-stream-kafka-binder in my pom:
<spring-cloud-stream.version>Brooklyn.SR3</spring-cloud-stream.version>
<spring-cloud.version>Brixton.SR7</spring-cloud.version>
Which spring-cloud-stream version (jar) should I use in order to set this property?
Upvotes: 0
Views: 715
Reputation: 174739
With Brooklyn (binder 1.1.2), use the general configuration property map:
spring.cloud.stream.kafka.binder.configuration.max.request.size=
for a global setting, or
spring.cloud.stream.kafka.bindings.<dest>.producer.configuration.max.request.size=
for a specific destination (e.g. output).
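For example, assuming a producer binding named output (the default Spring Cloud Stream output channel) and an example limit of about 5 MB (the 5242880 value is an illustrative assumption, not a recommendation), the application.properties entries might look like this:
# Global: applies to every producer on this Kafka binder
spring.cloud.stream.kafka.binder.configuration.max.request.size=5242880
# Or per destination: only for the binding named "output"
spring.cloud.stream.kafka.bindings.output.producer.configuration.max.request.size=5242880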
Upvotes: 3