Pedro Alves

Reputation: 1054

Python - Spark - HiveContext - Can't find tables

I'm using a VM with Spark 1.6.0 and I'm trying to create a DataFrame with data from Hive.

I have 2 tables in the default database, but when I try to show all the tables using Spark I don't get any results:

sqlContext.sql("show tables").collect()
[]
sqlContext.sql("show databases").collect()
[Row(result=u'default')]

My Hive table was created directly in Hive, in the default database:

CREATE TABLE team
(id INT, sports STRING, players INT)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ","
LOCATION "/user/cloudera/data/sports";

Why can't I see my table?

Thanks!

Upvotes: 0

Views: 867

Answers (1)

Harika Krishna

Reputation: 36

You should copy the hive-site.xml file to the Spark conf directory. Without it, Spark's HiveContext cannot locate the Hive metastore and falls back to an empty local metastore, which is why `show tables` returns nothing.
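A minimal sketch of the copy step; the paths below are typical Cloudera VM defaults and may differ on your system:

```shell
# Copy Hive's client configuration into Spark's conf directory so that
# HiveContext connects to the real Hive metastore instead of creating
# an empty local one. Adjust the paths to match your installation.
sudo cp /etc/hive/conf/hive-site.xml /usr/lib/spark/conf/
```

After copying, restart your Spark shell or application and run `sqlContext.sql("show tables").collect()` again; the `team` table should now appear.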

Upvotes: 1

Related Questions