user2286963

Reputation: 135

How to update Kafka consumer max.request.size config while using Spark structured stream

Spark readStream for Kafka fails with the following error:

org.apache.kafka.common.errors.RecordTooLargeException (The message is 1166569 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.)

How do we bump up the max.request.size?

Code:

val ctxdb = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "ip:port")
  .option("subscribe", "topic")
  .option("startingOffsets", "earliest")
  .option("failOnDataLoss", "false")
  .option("max.request.size", "15728640")
  .load()

We have also tried setting .option("max.partition.fetch.bytes", "15728640"), with no luck.

Upvotes: 3

Views: 6877

Answers (1)

Yuval Itzchakov

Reputation: 149518

You need to add the kafka. prefix to the writer stream setting so that Spark passes it through to the underlying Kafka producer:

.option("kafka.max.request.size", "15728640")

Upvotes: 4
