Steven Park

Reputation: 377

java.lang.LinkageError in Spark Streaming

I am using Spark 2.2 on a CDH 5.10 cluster with Scala 2.11.8. Everything was working fine, but then I suddenly started getting this error in the driver code:

    Exception in thread "main" java.lang.LinkageError: loader constraint violation: when resolving method
    "org.apache.spark.streaming.StreamingContext$.getOrCreate(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;"
    the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, com/hp/hawkeye/driver/StreamingDriver$,
    and the class loader (instance of sun/misc/Launcher$AppClassLoader)
    for the method's defining class, org/apache/spark/streaming/StreamingContext$,
    have different Class objects for the type scala/Function0 used in the signature

Any ideas on how I can fix this?

Upvotes: 1

Views: 2160

Answers (1)

Steven Park

Reputation: 377

Figured out the solution: there was a class loader conflict, caused by a dependency jar I had manually placed on the cluster. Clearing the local dependency caches helped:

    rm -rf ~/.sbt
    rm -rf ~/.ivy2/cache

Then I restarted IDEA, and spark-submit on the cluster worked fine. But placing an extra dependency jar in lib (spark-avro-assembly-4.0.0-snapshot) brought the issue back: that jar, which includes a fix for spark-avro 3.2 on Spark 2.2, was somehow recreating the conflict.
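For context: `ChildFirstURLClassLoader` appears in the error because the job was running with `spark.driver.userClassPathFirst` enabled, so a user jar bundling its own copy of the Scala library can shadow the cluster's `scala.Function0` and trigger exactly this `LinkageError`. A minimal sketch of a submit command that avoids the manually placed jar (the package coordinates and jar name below are illustrative assumptions, not taken from the original setup):

```shell
# Sketch: let the cluster's Spark/Scala classes win, and pull spark-avro
# in through --packages instead of dropping an assembly jar into lib/.
# The coordinates and application jar name are assumptions for illustration.
spark-submit \
  --class com.hp.hawkeye.driver.StreamingDriver \
  --conf spark.driver.userClassPathFirst=false \
  --conf spark.executor.userClassPathFirst=false \
  --packages com.databricks:spark-avro_2.11:3.2.0 \
  streaming-driver.jar
```

With `--packages`, Spark resolves the dependency (and its transitive dependencies) from Ivy/Maven at submit time, so there is no hand-placed assembly jar carrying a second copy of the Scala classes.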

Upvotes: 1
