Jds

Reputation: 155

Need to access Hive table using database qualifier from spark

I am able to access the Hive table from spark-shell, but I receive the exception below when the same code runs as part of a submitted job:

val df = sqlContext.table("dbName.tableName")

Exception in thread "main" org.apache.spark.sql.AnalysisException: Specifying database name or other qualifiers are not allowed for temporary tables. If the table name has dots (.) in it, please quote the table name with backticks (`).;

Please let me know how I can resolve this.

Upvotes: 3

Views: 7053

Answers (1)

eliasah

Reputation: 40360

You can't do that from a plain SQLContext; you need to define a HiveContext for that, as follows:

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
import hiveContext.implicits._
import hiveContext.sql

A HiveContext adds support for finding tables in the Hive metastore and for writing queries in HiveQL. Users who do not have an existing Hive deployment can still create a HiveContext.

When not configured by hive-site.xml, the context automatically creates metastore_db and a warehouse directory in the current directory.

Once you have defined the HiveContext, you can express queries in HiveQL.
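For example, with the HiveContext in scope, the lookup from the question should work with the database qualifier (a minimal sketch; dbName.tableName stands in for your actual database and table):

// Look up the table through the Hive metastore, using the database qualifier
val df = hiveContext.table("dbName.tableName")

// Or express the same thing as a HiveQL query
val df2 = hiveContext.sql("SELECT * FROM dbName.tableName")

df.show()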

Upvotes: 11
