sudhir

Reputation: 1437

Hive tables are created from Spark but are not visible in Hive

From spark using:

DataFrame.write().mode(SaveMode.Ignore).format("orc").saveAsTable("myTableName")

The table is getting saved. I can see it with the command hadoop fs -ls /apps/hive/warehouse/test.db, where test is my database name:

drwxr-xr-x - psudhir hdfs 0 2016-01-04 05:02 /apps/hive/warehouse/test.db/myTableName

but when I try to list the tables in Hive I cannot see them, not even with SHOW TABLES from hiveContext.

Upvotes: 4

Views: 4577

Answers (4)

Erkan Şirin

Reputation: 2095

Strangely, I had to lowercase all my Spark DataFrame column names; only then could I see the table content from Hive.

df = df.toDF(*[c.lower() for c in df.columns])
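The renaming step above can be factored into a small helper; the list comprehension is exactly what gets passed to df.toDF. This is a minimal sketch: the helper itself is plain Python, while the PySpark write calls are shown as comments because they need a running SparkSession, and the table name test.myTableName is taken from the question.

```python
def lowercase_columns(columns):
    """Lowercase column names, as fed to df.toDF(*...) in the answer above."""
    return [c.lower() for c in columns]

# With a live SparkSession this would be applied as (sketch, not run here):
#   df = df.toDF(*lowercase_columns(df.columns))
#   df.write.mode("ignore").format("orc").saveAsTable("test.myTableName")

print(lowercase_columns(["CustomerId", "ORDER_Date", "amount"]))
# prints ['customerid', 'order_date', 'amount']
```

The mixed-case names are the common failure mode here, since Hive stores column names lowercased in its metastore.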

Upvotes: 0

Tanaji Sutar

Reputation: 119

sudo cp /etc/hive/conf.dist/hive-site.xml /etc/spark/conf/

This worked for me in the Cloudera QuickStart VirtualBox VM.

Upvotes: 2

lmtx

Reputation: 5576

You have to copy the hive-site.xml file (mine is at /etc/hive/conf.dist/hive-site.xml) to the Spark conf folder (mine is at /etc/spark/conf/):

sudo cp /etc/hive/conf.dist/hive-site.xml /etc/spark/conf/

Restart Spark and it should work.

Upvotes: 1

Simon McGloin

Reputation: 47

I think you need to run INVALIDATE METADATA; in the Hive console to refresh the databases and see your new table.

Upvotes: -3
