Mukund S

Reputation: 95

Error while using SparkSession or sqlcontext

I am new to Spark. I am trying to parse a JSON file using SparkSession (or SQLContext), but whenever I run the code I get the following error.

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.internal.config.package$.CATALOG_IMPLEMENTATION()Lorg/apache/spark/internal/config/ConfigEntry;
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$sessionStateClassName(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:112)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:110)
    at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:535)
    at org.apache.spark.sql.SparkSession.read(SparkSession.scala:595)
    at org.apache.spark.sql.SQLContext.read(SQLContext.scala:504)
    at joinAssetsAndAd$.main(joinAssetsAndAd.scala:21)
    at joinAssetsAndAd.main(joinAssetsAndAd.scala)

So far I have created a Scala project in the Eclipse IDE, configured it as a Maven project, and added the Spark core and Spark SQL dependencies.

My dependencies :

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
</dependencies>
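My Scala source is not included above; for reference, a minimal Spark 2.x JSON read along the lines the stack trace suggests (the object name comes from the trace, while the path, app name, and master setting are placeholders) would look like:

```scala
// Minimal sketch of joinAssetsAndAd.scala; names other than the object are assumed.
import org.apache.spark.sql.SparkSession

object joinAssetsAndAd {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("joinAssetsAndAd")
      .master("local[*]") // local testing inside the IDE
      .getOrCreate()

    // spark.read constructs a DataFrameReader -- the call that fails in the stack trace
    val df = spark.read.json("path/to/input.json") // placeholder path
    df.printSchema()

    spark.stop()
  }
}
```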

Could you please explain why I am getting this error and how to correct it?

Upvotes: 3

Views: 10462

Answers (1)

L. CWI

Reputation: 962

Use the same version for spark-core and spark-sql: change the spark-sql version to 2.1.0. A `NoSuchMethodError` like this typically means the two artifacts were compiled against different internal APIs, so at runtime spark-sql calls a method that does not exist in the spark-core version on the classpath.
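Assuming the rest of the pom stays as posted, the dependency block would become:

```xml
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version> <!-- was 2.0.0; must match spark-core -->
    </dependency>
</dependencies>
```

Defining the version once in a Maven property (e.g. `<spark.version>2.1.0</spark.version>`) and referencing it from both dependencies is a common way to keep them in sync.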

Upvotes: 18
