Björn Jacobs

Reputation: 4262

Using remote standalone cluster in local Spark application error "scala.Option; local class incompatible"

I set up a remote Spark standalone cluster with several nodes. For that I downloaded this Spark release: spark-1.5.1-bin-hadoop1 (because we have an old Hadoop 1 service running).

In my local Scala program I defined the path to the master:

val conf = new SparkConf().setMaster("spark://node01.kdlan:7078").setAppName("My App")
val sc = new SparkContext(conf)

When starting my local program I see this error in the log:

16/01/15 10:35:34 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://[email protected]:7078] has failed, address is now gated for [5000] ms. Reason: [Disassociated] 

On the remote master I can see this in the Spark log files:

16/01/15 10:35:34 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@myip:43288] has failed, address is now gated for [5000] ms. Reason is: [scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = -2062608324514658839].
16/01/15 10:35:34 INFO Master: myip:43288 got disassociated, removing it.

Somewhere I read that the Spark dependencies should be marked as 'provided', which I did in my build.sbt. There are no other library dependencies defined.

scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.1" % "provided"

The Spark version matches (1.5.1) and so does the Scala major version (2.11), so where does the incompatible scala.Option class come from, and how can I solve this?

Upvotes: 0

Views: 98

Answers (1)

Björn Jacobs

Reputation: 4262

Credit for this answer goes to mark91.

This error occurs because the pre-compiled Spark distributions that you can download from the Spark homepage are built with Scala 2.10 - hence the version mismatch on the scala.Option class.

If you want to use Scala 2.11, you have to compile Spark yourself. A source package is available on the Spark homepage, along with instructions on how to build it.
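Based on the building instructions in the Spark 1.5 documentation, the build looks roughly like this (the Hadoop version flag is an example - adjust it to match your cluster):

```shell
# Inside the unpacked Spark 1.5.1 source release:
# 1. Switch the build to Scala 2.11
./dev/change-scala-version.sh 2.11
# 2. Build against Hadoop 1.x (example version), skipping tests
mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests clean package
```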

The Spark download page actually contains this note, which I had missed:

Note: Scala 2.11 users should download the Spark source package and build with Scala 2.11 support.
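Alternatively, if rebuilding Spark is not an option, you can match the pre-built 2.10 binaries by switching your local project to Scala 2.10 instead - a minimal build.sbt sketch (the exact 2.10 patch version is an example):

```scala
// Match the Scala version the pre-built Spark 1.5.1 binaries were compiled with
scalaVersion := "2.10.6"

// Note the _2.10 artifact suffix instead of _2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.5.1" % "provided"
```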

Upvotes: 1
