matthieu lieber

Reputation: 662

Error at runtime, sbt compilation passes

I have a piece of code that compiles fine (Scala + Spark 1.6). I then run it (with Spark 1.6), but it complains that a 1.6 method is missing. What gives?

simple.sbt:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Conjars" at "http://conjars.org/repo"
resolvers += "cljars" at "https://clojars.org/repo/"

mainClass in Compile := Some("Medtronic.Class")

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "1.7.2"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark" % "2.1.1"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"

Compilation:

$ sbt assembly
[info] Loading project definition from /Users/mlieber/projects/spark/test/project
[info] Set current project to Simple Project (in build file:/Users/mlieber/projects/spark/test/)
[info] Updating {file:/Users/mlieber/projects/spark/test/}test...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] Scala version was updated by one of library dependencies:
[warn]  * org.scala-lang:scala-library:(2.10.4, 2.10.0) -> 2.10.5
[warn] To force scalaVersion, add the following:
[warn]  ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0
[warn] Run 'evicted' to see detailed eviction warnings
..

[info] Run completed in 257 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
..
[info] Including from cache: spark-core_2.10-1.6.0.jar
..
[info] Including from cache: spark-streaming_2.10-1.6.0.jar
..
[info] Assembly up to date: /Users/mlieber/projects/spark/test/target/scala-2.10/stream_test_1.0.jar
[success] Total time: 98 s, completed Jan 28, 2016 4:05:22 PM

I run with:

./app/spark-1.6.0-bin-hadoop2.6/bin/spark-submit --jars /Users/mlieber/app/elasticsearch-1.7.2/lib/elasticsearch-1.7.2.jar  --master local[4] --class "MyClass"    ./target/scala-2.10/stream_test_1.0.jar 

Runtime error:

    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.streaming.dstream.PairDStreamFunctions.mapWithState(Lorg/apache/spark/streaming/StateSpec;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/MapWithStateDStream;    
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
..
    16/01/28 18:35:23 INFO SparkContext: Invoking stop() from shutdown hook

Upvotes: 1

Views: 518

Answers (1)

marcospereira

Reputation: 12214

Your project is suffering from Dependency Hell. What is happening is that sbt resolves transitive dependencies by default, and one of your dependencies (elasticsearch-spark) requires a different version of spark-core. From your logs:

[warn] Here are some of the libraries that were evicted:
[warn]  * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0

It looks like the version required by elasticsearch-spark is not binary compatible with the one declared by your project, so you get an error when your project runs.

There is no error at compile time because the code being compiled (i.e., your code) is compatible with the version resolved at build time.
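For context, `mapWithState` (the method named in the `NoSuchMethodError`) was added to `PairDStreamFunctions` in Spark 1.6. Here is a minimal, self-contained sketch of code that exercises it; the socket source, port, and checkpoint path are illustrative, not from the question:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

    object MapWithStateSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setMaster("local[2]").setAppName("MapWithStateSketch")
        val ssc  = new StreamingContext(conf, Seconds(1))
        ssc.checkpoint("/tmp/checkpoint") // mapWithState requires a checkpoint directory

        val words = ssc.socketTextStream("localhost", 9999)
          .flatMap(_.split(" "))
          .map(w => (w, 1))

        // Running count per key, kept in Spark-managed state.
        def trackingFunc(key: String, value: Option[Int], state: State[Int]): (String, Int) = {
          val sum = value.getOrElse(0) + state.getOption.getOrElse(0)
          state.update(sum)
          (key, sum)
        }

        // Compiles only against spark-streaming >= 1.6, where mapWithState exists.
        // If an older streaming jar wins on the runtime classpath, the JVM throws
        // exactly the NoSuchMethodError shown in the question.
        words.mapWithState(StateSpec.function(trackingFunc _)).print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

The compiler only checks against the jars sbt resolved at build time; the JVM looks the method up again at run time against whatever jars are actually on the classpath, which is why the two can disagree.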

Here are some options for solving this (a build-file sketch follows the list):

  1. You can try upgrading elasticsearch-spark to version 2.1.2 and see if it pulls in a newer version of spark-core (one that is compatible with your project). Version 2.2.0-rc1 depends on spark-core 1.6.0, and upgrading to it should fix the problem, but keep in mind that you would be using a release-candidate version.
  2. You can downgrade spark-core and spark-streaming to version 1.4.1 (the version used by elasticsearch-spark) and adapt your code where necessary.
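As a sketch of what this could look like in simple.sbt (the version numbers are the ones discussed above; the exclusion is one common way to keep a transitive spark-core off the classpath, and whether the 2.1.1 connector actually works against Spark 1.6 at runtime would still need testing):

    // Option 1 (sketch): move to the release candidate, which depends on
    // spark-core 1.6.0 and so no longer drags in the 1.4.x line.
    libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark" % "2.2.0-rc1"

    // Alternative (sketch): keep 2.1.1 but exclude its transitive spark-core,
    // so your explicitly declared 1.6.0 is the only copy sbt resolves.
    libraryDependencies += ("org.elasticsearch" %% "elasticsearch-spark" % "2.1.1")
      .exclude("org.apache.spark", "spark-core_2.10")

Either way, running `sbt evicted` afterwards (as your build log suggests) will show whether any spark artifacts are still being evicted.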

Upvotes: 2
