Reputation: 6354
I am new to both Scala and SBT and, in an attempt to learn something new, am trying to work through the book "Building a Recommendation Engine with Scala". The example libraries referenced in the book have since been replaced by later versions or, in some cases, seemingly superseded by different techniques (Casbah by the Mongo Scala driver). This has led to me producing some potentially incorrect SBT build files. My initial build file was:
name := "BuildingScalaRecommendationEngine"
scalaVersion := "2.12.1"
version := "1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.1.1"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.1.0"
libraryDependencies += "org.apache.kafka" % "kafka_2.12" % "0.10.2.0"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.1"
This led to a build error like:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/C:/Dev/learning/scala/Tutorial/src/}src:
[error] org.scala-lang.modules:scala-xml _2.11, _2.12
java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml
I tried following some examples suggested at this link, which I found in a Gitter conversation here. The suggestions were all a little beyond my understanding at this point, though.
I managed to work around the error, mostly through trial and error, by amending my build file to look like the following:
name := "BuildingScalaRecommendationEngine"
scalaVersion := "2.12.1"
version := "1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0" excludeAll(
ExclusionRule(organization = "org.scala-lang.modules")
)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.1.1" excludeAll(
ExclusionRule(organization = "org.scala-lang.modules")
)
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.1.0" excludeAll(
ExclusionRule(organization = "org.scala-lang.modules")
)
libraryDependencies += "org.apache.kafka" % "kafka_2.12" % "0.10.2.0" excludeAll(
ExclusionRule(organization = "org.scala-lang.modules")
)
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.1" excludeAll(
ExclusionRule(organization = "org.scala-lang.modules")
)
Is there something I could have done with my original build file to get around this error?
Is there a combination of library dependency values that is causing this in the first place?
I aim to get a better understanding of Scala and SBT to overcome this, but it is somewhat frustrating in the meantime.
Upvotes: 3
Views: 5783
Reputation: 469
This is a version conflict exception. You are using Spark libraries built for Scala 2.11 ("spark-streaming_2.11"), so change the Scala version from 2.12.1 to 2.11.x to avoid the conflict.
Use the following sbt build file:
name := "BuildingScalaRecommendationEngine"
scalaVersion := "2.11.6"
version := "1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.1.1"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.1.0"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.10.2.0"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.1"
Upvotes: 0
Reputation: 128131
tl;dr: you cannot use Scala 2.12 because Spark does not support it yet, and you also need to use %% when specifying dependencies to avoid problems with incorrect binary versions. Read below for more explanation.
Scala 2.x releases are binary incompatible with one another, therefore all libraries have to be compiled separately for each such release (2.10, 2.11 and 2.12 being the currently used ones, although 2.10 is on its way to becoming legacy). That's what the _2.12 and _2.11 suffixes are about.
Naturally, you cannot use libraries compiled for a different version of Scala than the one you're currently using. So if you set your scalaVersion to, say, 2.12.1, you cannot use libraries with names suffixed by _2.11. This is why it is possible to write either "groupName" % "artifactName" or "groupName" %% "artifactName": in the latter case, when you use the double percent sign, the current Scala binary version is appended to the name automatically:
scalaVersion := "2.12.1"
"groupName" %% "artifactName" % "version"
// is equivalent to
"groupName" % "artifactName_2.12" % "version"
So, in 99% of cases, you want to set your Scala version once and then use the %% operator for specifying Scala libraries.
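To make the suffixing concrete, here is a toy sketch (not sbt's actual implementation; the object and method names are made up for illustration) of how %% derives the artifact name from the Scala binary version:

```scala
// Toy model of sbt's %% cross-versioning: the artifact name gets the
// Scala *binary* version (major.minor) appended with an underscore.
object CrossVersionDemo {
  // e.g. "2.12.1" -> "2.12"
  def binaryVersion(scalaVersion: String): String =
    scalaVersion.split('.').take(2).mkString(".")

  // e.g. ("spark-mllib", "2.12.1") -> "spark-mllib_2.12"
  def crossName(artifact: String, scalaVersion: String): String =
    s"${artifact}_${binaryVersion(scalaVersion)}"

  def main(args: Array[String]): Unit = {
    println(crossName("spark-mllib", "2.12.1")) // spark-mllib_2.12
    println(crossName("akka-actor", "2.11.11")) // akka-actor_2.11
  }
}
```
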
In your case, however, the problem is that you want to use Scala 2.12, but you're trying to pull in Spark compiled for Scala 2.11. In an ideal world the proper solution would be to use %% for all your dependencies, so that their 2.12-compatible artifacts would be used; however, Spark artifacts are not yet published for Scala 2.12 (some issues prevent Spark from being stable on 2.12, as far as I remember). Therefore, you should change your Scala version to 2.11.11 (the latest Scala version in the 2.11 branch), then drop the _2.11/_2.12 suffixes from your artifact names and use %% instead. Then it should work.
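Putting that advice together, a corrected build file might look like the sketch below. It assumes each of these artifact versions is actually published for Scala 2.11 (worth verifying on Maven Central before relying on it):

```scala
// Sketch of a corrected build.sbt: one Scala version, %% everywhere,
// so every artifact resolves with the matching _2.11 suffix.
name := "BuildingScalaRecommendationEngine"

scalaVersion := "2.11.11"

version := "1.0"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-mllib"        % "2.1.0",
  "org.apache.spark"  %% "spark-streaming"    % "2.1.1",
  "org.mongodb.scala" %% "mongo-scala-driver" % "2.1.0",
  "org.apache.kafka"  %% "kafka"              % "0.10.2.0",
  "com.typesafe.akka" %% "akka-actor"         % "2.5.1"
)
```

Because every dependency now shares the 2.11 binary suffix, the conflicting scala-xml _2.11/_2.12 mix that caused the original error should no longer occur, and the ExclusionRule workaround becomes unnecessary.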
Upvotes: 5