Jaffer Wilson

Reputation: 7273

Apache Spark giving error in IntelliJ

I have written code for SparkSQL and ran it using Eclipse Neon, where it worked fine. When I switched to IntelliJ IDEA, the same code no longer runs.
Here is the code:

SparkSession spark = SparkSession
            .builder()
            .appName("Java Spark Hive Example")
            .master("local[*]")
            .config("hive.metastore.uris", "thrift://localhost:9083")
            .enableHiveSupport()
            .getOrCreate();
spark.sql("select survey_response_value from health").show();

The exceptions I am getting are:
https://justpaste.it/13tsa
Kindly let me know why I am facing this problem and how I can resolve it. Should I rewrite the code from scratch in IntelliJ, or is there some other way to fix this?

Upvotes: 1

Views: 685

Answers (1)

sgvd

Reputation: 3939

The problem is this exception:

com.fasterxml.jackson.databind.JsonMappingException: Jackson version is too old 2.5.1

Basically, Spark uses Jackson version 2.6.5, but another dependency of yours puts an older version (2.5.1) of Jackson on the classpath. It is hard to say why this differs between Eclipse and IntelliJ, but my guess is that they build up the classpath differently, and with Eclipse the correct version happens to be placed first.
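To confirm which jar (and hence which Jackson version) actually wins on your classpath, a small diagnostic like the following can help. This is a generic sketch, not specific to Spark; the class name is passed as a string so you can probe any dependency, e.g. `com.fasterxml.jackson.databind.ObjectMapper`:

```java
// Sketch: report which location a class is loaded from, to diagnose
// classpath conflicts where two versions of a library are present.
public class ClasspathProbe {
    public static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // Classes loaded by the bootstrap classloader have no CodeSource.
            return src == null ? "bootstrap classloader" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // In your setup, probe "com.fasterxml.jackson.databind.ObjectMapper"
        // to see which jar (and thus which version) is being picked up.
        System.out.println(locate("java.lang.String"));
    }
}
```

Running this from the same launch configuration that produces the error shows the offending jar path directly.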

There are several questions about this error on SO.

For me, using Spark 2.x, adding the following to your build.sbt ensures the correct version is used:

dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
)
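If your project is built with Maven instead of sbt (common for Java projects in IntelliJ), the equivalent pin goes in `dependencyManagement`. This is a sketch assuming the same Jackson 2.6.5 target as above:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Entries in `dependencyManagement` force that version for transitive dependencies as well, which is what resolves the conflict here.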

Upvotes: 5