Naman Agarwal

Reputation: 654

Spark 2.1.0 connection with Kafka 0.9.0

I am using Kafka 0.9.0 and Spark 2.1.0. My spark-submit command is as follows:

./spark-submit --jars /home/cnbo/jars/spark-sql-kafka-0-10_2.11-2.1.0.cloudera1.jar --class ClickStream /home/cnbo/jars/sparkstreamingfi_2.11-0.1.jar

I am getting following error:

java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.subscribe(Ljava/util/Collection;)V

I know there is a version conflict between 0.9 and 0.10, but I want to run my application on Kafka 0.9.0. What needs to be done now? Which external JAR should I use instead of:

spark-sql-kafka-0-10_2.11-2.1.0.cloudera1.jar

Thanks in Advance!!

Upvotes: 1

Views: 518

Answers (2)

OneCricketeer

Reputation: 191681

If you read the Spark Kafka integration page, you'd have seen:

0.8 integration is compatible with later 0.9 and 0.10 brokers, but the 0.10 integration is not compatible with earlier brokers

Therefore, you need this library instead:

spark-streaming-kafka-0-8_2.11
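
A minimal sketch of a driver using that 0-8 direct stream API (the broker address, topic name, and batch interval below are placeholders, not taken from the question):

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ClickStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ClickStream")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Placeholder broker and topic -- substitute your own values.
    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val topics      = Set("clickstream")

    // Direct stream from the 0-8 integration; it also works against 0.9 brokers.
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    stream.map(_._2).print()  // print the message values of each micro-batch

    ssc.start()
    ssc.awaitTermination()
  }
}

You would then pass the spark-streaming-kafka-0-8_2.11 JAR (matching your Spark version) via --jars in place of the 0-10 one.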

And if you want Structured Streaming, you need to actually upgrade Kafka to support the new consumer API.

Upvotes: 1

himanshuIIITian

Reputation: 6085

In order to use Spark along with Kafka, you also need to add the following JAR to spark.driver.extraClassPath and spark.executor.extraClassPath:

/home/cnbo/jars/kafka-clients-0.9.0.0.jar

Since spark-sql-kafka-0-10_2.11-2.1.0.cloudera1.jar does not contain KafkaConsumer, we need to add the above-mentioned JAR file as well. So the final spark-submit command will look like this:

./spark-submit --jars /home/cnbo/jars/spark-sql-kafka-0-10_2.11-2.1.0.cloudera1.jar,/home/cnbo/jars/kafka-clients-0.9.0.0.jar --class ClickStream /home/cnbo/jars/sparkstreamingfi_2.11-0.1.jar
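
If you also want the client JAR on the driver and executor classpaths explicitly, as mentioned above, a sketch of the same command with the extra --conf flags (paths reused from the question) would be:

./spark-submit \
  --jars /home/cnbo/jars/spark-sql-kafka-0-10_2.11-2.1.0.cloudera1.jar,/home/cnbo/jars/kafka-clients-0.9.0.0.jar \
  --conf spark.driver.extraClassPath=/home/cnbo/jars/kafka-clients-0.9.0.0.jar \
  --conf spark.executor.extraClassPath=/home/cnbo/jars/kafka-clients-0.9.0.0.jar \
  --class ClickStream /home/cnbo/jars/sparkstreamingfi_2.11-0.1.jar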

I hope it helps!

Upvotes: 2
