Aviral Kumar

Reputation: 824

Why does executing SQL against Hive table using SQLContext in application fail (but the same query in spark-shell works fine)?

I am using Spark 1.6.

I am trying to query a table from my Spark SQL Java code:

JavaSparkContext js = new JavaSparkContext();
SQLContext sc = new SQLContext(js);

DataFrame mainFile = sc.sql("SELECT * FROM db.table");

It gives me a "table not found" exception.

But when I run the same query in spark-shell using Scala, it works fine. The table gets accessed and I can print out the data as well.

Any inputs on this issue?

Upvotes: 1

Views: 245

Answers (1)

abaghel

Reputation: 15297

spark-shell provides a `HiveContext`, which can resolve tables registered in the Hive metastore; a plain `SQLContext` cannot, which is why your query fails with "table not found". If you want to use `HiveContext` in Java code, add the spark-hive dependency to your application and then use it in your Java program. Please refer to http://spark.apache.org/docs/1.6.2/sql-programming-guide.html#hive-tables

<dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.6.2</version>
</dependency>
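With that dependency in place, the Java code from the question can be reworked to use `HiveContext` instead of `SQLContext`. A minimal sketch (the app name and `db.table` are placeholders matching the question; this assumes Hive's `hive-site.xml` is on the classpath so the metastore can be found):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class HiveQueryExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("HiveQueryExample");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // HiveContext reads table metadata from the Hive metastore,
        // so tables defined in Hive are visible to SQL queries.
        // A plain SQLContext only sees temporary/registered tables.
        HiveContext hiveContext = new HiveContext(jsc.sc());

        DataFrame mainFile = hiveContext.sql("SELECT * FROM db.table");
        mainFile.show();

        jsc.stop();
    }
}
```

Note that `HiveContext` extends `SQLContext`, so everything that worked with `SQLContext` continues to work; you only gain access to Hive tables and HiveQL features.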

Upvotes: 3
