Eric

Reputation: 2705

Why does KafkaUtils.createDirectStream throw a NoSuchMethodError?

import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;

import kafka.serializer.StringDecoder;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.kafka.KafkaUtils;

HashSet<String> topicsSet = new HashSet<String>(Arrays.asList(config.getKafkaTopics().split(",")));
HashMap<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("metadata.broker.list", config.getKafkaBrokers());

// Create a direct Kafka stream with the given brokers and topics
JavaPairInputDStream<String, String> messages = KafkaUtils.createDirectStream(jssc, String.class, String.class,
        StringDecoder.class, StringDecoder.class, kafkaParams, topicsSet);

I am creating a Kafka stream using the createDirectStream function from KafkaUtils, as shown above. I think it's pretty standard, and it worked with Spark-1.5.1.

I switched to Spark-1.6.1, and although I'm not sure whether the version change is the cause, it now throws the following error:

Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
    at kafka.api.RequestKeys$.<init>(RequestKeys.scala:39)
    at kafka.api.RequestKeys$.<clinit>(RequestKeys.scala)
    at kafka.api.TopicMetadataRequest.<init>(TopicMetadataRequest.scala:53)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:122)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:112)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:211)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
    at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
    at com.analytics.kafka.consumer.SystemUserAnalyticsConsumer.main(SystemUserAnalyticsConsumer.java:59)
    ... 6 more

This gives very little information about what exactly the problem is.

What is the problem here?

Upvotes: 2

Views: 942

Answers (1)

Jacek Laskowski

Reputation: 74779

The `NoSuchMethodError` on `scala.Predef$.ArrowAssoc` is a Scala binary-compatibility problem: the Kafka artifact on your classpath was compiled against a different Scala major version (e.g. 2.10 vs 2.11) than the one used at runtime. In other words, the Scala versions used at compile/build time and at runtime for your Spark Streaming Kafka dependency don't match, e.g.

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.0.1"

Make sure the Scala versions match everywhere (and note the double percent sign `%%`, which takes care of it by appending the Scala binary version from `scalaVersion` to the artifact name).
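As a minimal sketch of a consistent sbt setup for Spark 1.6.1 (the versions below are illustrative; 1.6.1 was built against Scala 2.10 by default, so `scalaVersion` must agree with your Spark distribution):

```scala
// build.sbt -- keep scalaVersion consistent with the Scala version
// your Spark distribution was built with (2.10 for a default 1.6.1 build).
scalaVersion := "2.10.6"

// %% appends the Scala binary suffix (_2.10) automatically, so every
// artifact on the classpath is binary-compatible with the same Scala.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"       % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"
)
```

If any dependency is declared with a single `%` and a hard-coded suffix (e.g. `spark-streaming-kafka_2.11`) while `scalaVersion` says 2.10, you get exactly this kind of `NoSuchMethodError` at runtime.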

Upvotes: 2
