avy

Reputation: 437

Missing HiveContext Dependency

I am trying this:

val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

This is my build.sbt file:

name := "SPARK-SQL"

version := "1.0"

scalaVersion := "2.11.8"


libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0-preview"


libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.0.0-preview"

libraryDependencies += "org.apache.spark" % "spark-hive-thriftserver_2.10" % "1.6.2"

The error I am getting:

Error:scalac: missing or invalid dependency detected while loading class file 'HiveContext.class'.

Upvotes: 1

Views: 1111

Answers (3)

ruloweb

Reputation: 754

It looks like you are trying to use the spark-hive library from 1.6 together with the Spark 2.0 libraries; I'm not sure that combination works.

For Spark 1.6.x you can do:

libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.6.3" % "provided"
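With that 1.6.x dependency in place, the HiveContext from the question compiles as-is. A minimal sketch (the app name, master, and the SHOW TABLES query are just illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// HiveContext wraps an existing SparkContext.
val conf = new SparkConf().setAppName("SPARK-SQL").setMaster("local[*]")
val sc = new SparkContext(conf)
val hiveContext = new HiveContext(sc)

// Any HiveQL statement works here; SHOW TABLES is just an example.
hiveContext.sql("SHOW TABLES").show()
```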

Upvotes: 0

Sandeep Purohit

Reputation: 3692

In Spark 2.0 you can simply use a SparkSession with Hive support enabled instead of a HiveContext, for example:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark Hive Example")
  .enableHiveSupport()
  .getOrCreate()
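For this to compile, the build.sbt from the question would need matching 2.0 artifacts; note that enableHiveSupport() also requires the spark-hive classes on the classpath. A consistent set of dependencies might look like this (the versions shown are one plausible combination, not the only one):

```scala
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql"  % "2.0.0"
// enableHiveSupport() needs the Hive support classes at runtime:
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0"
```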

Upvotes: 1

Shubham Rajput

Reputation: 1185

To work with org.apache.spark.sql.hive.HiveContext, you need the following dependency combination in your build.sbt or pom.xml, and you should switch to Scala 2.10.6, because spark-hive_2.10 is built against Scala 2.10:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>2.0.0</version>
</dependency> 
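Since the question uses build.sbt rather than pom.xml, the same combination can be expressed in sbt (a direct translation of the Maven fragments above):

```scala
scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "2.0.0"
```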

Upvotes: 0
