Reputation: 25
I have written a very simple pipeline in Apache Beam, as follows, to read data from my Kafka cluster on Confluent Cloud:
Pipeline pipeline = Pipeline.create(options);
Map<String, Object> propertyBuilder = new HashMap<>();
propertyBuilder.put("ssl.endpoint.identification.algorithm", "https");
propertyBuilder.put("sasl.mechanism","PLAIN");
propertyBuilder.put("request.timeout.ms","20000");
propertyBuilder.put("retry.backoff.ms","500");
pipeline
.apply(KafkaIO.<byte[], byte[]>readBytes()
.withBootstrapServers("pkc-epgnk.us-central1.gcp.confluent.cloud:9092")
.withTopic("gcp-ingestion-1")
.withKeyDeserializer(ByteArrayDeserializer.class)
.withValueDeserializer(ByteArrayDeserializer.class)
.updateConsumerProperties(propertyBuilder)
    .withoutMetadata() // PCollection<KV<byte[], byte[]>>
)
.apply(Values.<byte[]>create());
However, I get the exception below when running the above code to read data from my Kafka cluster.
I am running this on the Direct Java runner with Beam 2.8. I can read from and produce messages to my Confluent Kafka cluster with other tools, but not with the code above.
Upvotes: 1
Views: 1340
Reputation: 2539
If you follow the stack trace, it appears that the code tries to cast the timeout configuration property to an Integer: https://github.com/apache/beam/blob/2e759fecf63d62d110f29265f9438128e3bdc8ab/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaUnboundedReader.java#L112
But instead it gets a String. My guess is that this is because you set it as a string here: propertyBuilder.put("request.timeout.ms","20000"). I assume the correct thing would be to set it as an Integer, e.g. propertyBuilder.put("request.timeout.ms", 20000) (no quotes around the timeout value).
You may also have similar issues with other configuration properties (e.g. retry.backoff.ms), so you should double-check the property types.
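For illustration, here is a minimal sketch of the adjusted snippet, mirroring the code in the question but with the numeric consumer properties passed as Integer values rather than strings. I have not run this against your cluster, so treat it as an assumption about the fix rather than a tested configuration:

// Numeric Kafka settings are passed as Integer values, matching the cast
// that KafkaUnboundedReader performs on them.
Map<String, Object> propertyBuilder = new HashMap<>();
propertyBuilder.put("ssl.endpoint.identification.algorithm", "https");
propertyBuilder.put("sasl.mechanism", "PLAIN");
propertyBuilder.put("request.timeout.ms", 20000); // Integer, not "20000"
propertyBuilder.put("retry.backoff.ms", 500);     // Integer, not "500"

pipeline
    .apply(KafkaIO.<byte[], byte[]>readBytes()
        .withBootstrapServers("pkc-epgnk.us-central1.gcp.confluent.cloud:9092")
        .withTopic("gcp-ingestion-1")
        .withKeyDeserializer(ByteArrayDeserializer.class)
        .withValueDeserializer(ByteArrayDeserializer.class)
        .updateConsumerProperties(propertyBuilder)
        .withoutMetadata() // PCollection<KV<byte[], byte[]>>
    )
    .apply(Values.<byte[]>create());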
Upvotes: 2