Kumar Harsh

Reputation: 453

Getting dependency errors for SparkSession and SQLContext

I am getting dependency errors for SQLContext and SparkSession in my Spark program:

val sqlContext = new SQLContext(sc)
val spark = SparkSession.builder()

Error for SQLContext:

Symbol 'type org.apache.spark.Logging' is missing from the classpath. This symbol is required by 'class org.apache.spark.sql.SQLContext'. Make sure that type Logging is in your classpath and check for conflicting dependencies with -Ylog-classpath. A full rebuild may help if 'SQLContext.class' was compiled against an incompatible version of org.apache.spark.

Error for SparkSession:

not found: value SparkSession

Below are the Spark dependencies in my pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-test-tags_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>

Upvotes: 0

Views: 775

Answers (1)

cheseaux

Reputation: 5325

You can't mix Spark 2 and Spark 1.6 dependencies in the same project. org.apache.spark.Logging was removed in Spark 2, which is why your SQLContext class (compiled against Spark 1.6) can't find it on the classpath.

Change

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>

to

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
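
Note that once every Spark artifact is pinned to 1.6.0-cdh5.15.1, SparkSession is no longer available (it was only introduced in Spark 2.0), so SQLContext remains the entry point. A minimal sketch, assuming the 1.6 dependencies above (the app name and JSON path are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextExample {
  def main(args: Array[String]): Unit = {
    // Build the SparkContext first; SQLContext wraps it in Spark 1.6.
    val conf = new SparkConf().setAppName("sqlcontext-example")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // DataFrame reads go through sqlContext in 1.6 (placeholder path).
    val df = sqlContext.read.json("path/to/people.json")
    df.show()

    sc.stop()
  }
}

Conversely, if you actually need SparkSession, move every Spark artifact to a matching 2.x version (with the _2.11 Scala suffix Spark 2 is typically built against) instead of mixing the two release lines.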

Upvotes: 1
