hermi zied

Reputation: 55

Error executing Spark with IntelliJ

When I try to run Spark from IntelliJ, I get the following exception:

   Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1306)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
        at batch.BatchT$.main(BatchT.scala:15)
        at batch.BatchT.main(BatchT.scala)

I use Spark 1.6, and I created a module with Scala 2.12.4 support.

Upvotes: 0

Views: 798

Answers (1)

Assaf Mendelson

Reputation: 13001

The reason for this is that Scala is not binary compatible between minor versions. Spark 1.6 is compiled against Scala 2.10 by default (an option exists to manually compile it with Scala 2.11), while Spark 2.0+ is compiled against Scala 2.11 by default.
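You can see this coupling in the published artifact names, which carry the Scala binary version as a suffix, for example on Maven Central:

    org.apache.spark:spark-core_2.10:1.6.3    (Spark 1.6 built for Scala 2.10)
    org.apache.spark:spark-core_2.11:2.0.2    (Spark 2.0 built for Scala 2.11)

There is no spark-core_2.12 artifact for these releases, which is why a module on Scala 2.12 cannot link against them.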

No Spark version is currently compiled against Scala 2.12 (support for it is planned only for Spark 3.0).

The easiest solution is to downgrade your Scala version to 2.10 (or to 2.11 if you use a newer Spark version such as 2.0+, or a Spark 1.6 build specifically compiled for 2.11).
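For example, if the module is an sbt project (a Maven pom works the same way), a minimal sketch of a build that lines the Scala version up with Spark 1.6 could look like this. The project name and the exact Spark patch version (1.6.3 here) are just placeholders:

    // Hypothetical build.sbt sketch: align the module's Scala version with Spark 1.6,
    // which is published for Scala 2.10 (and, as a separate build, 2.11).
    name := "batch-example"        // placeholder project name

    scalaVersion := "2.10.6"       // must match the Scala version Spark was built with

    // %% appends the Scala binary suffix, so this resolves to spark-core_2.10.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"

After changing the Scala version, also make sure the Scala SDK configured for the IntelliJ module matches, so the IDE and the build agree.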

Upvotes: 4
