Reputation: 2002
I want to create/access the hive tables from spark.
I have placed hive-site.xml inside the spark/conf directory. Even so, it creates a local metastore in the directory where I run the spark shell and exits with an error.
I am getting this error when I try to create a new hive table.
sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
Please suggest a way to resolve this.
15/02/12 10:35:58 ERROR RetryingHMSHandler: MetaException(message:file:/user/hive/warehouse/src is not a directory or unable to create one)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Upvotes: 2
Views: 3599
Reputation: 652
I ran into a similar error while running Spark SQL against Hive. It turned out that the user running Spark SQL (on my Mac) did not have write permission to the local directory /user/hive/warehouse, which Spark/Hive was trying to create (I'm not sure why, since my metastore is in MySQL and my data files are in HDFS). The error went away after I started the Spark shell with "sudo", i.e.,
bin> sudo ./spark-shell
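An alternative to running the whole shell as root is to make the warehouse path writable by the user who runs Spark. This is only a sketch: /user/hive/warehouse is the local path from the error above, and you may need to prefix the two commands with sudo if the parent directory is root-owned.

```shell
# Create the warehouse directory and hand ownership to the Spark user,
# so spark-shell no longer needs to run under sudo.
WAREHOUSE="${WAREHOUSE:-/user/hive/warehouse}"
mkdir -p "$WAREHOUSE"            # prefix with sudo if / is root-owned
chown -R "$(whoami)" "$WAREHOUSE"
ls -ld "$WAREHOUSE"              # owner should now be the current user
```

After this, bin/spark-shell can be started without sudo.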
Upvotes: 1
Reputation: 21
I encountered the same problem and solved it as follows:
1. Add the Hive conf directory to SPARK_CLASSPATH in spark-env.sh, e.g. SPARK_CLASSPATH=/opt/apache-hive-0.13.1-bin/conf
2. Edit hive-site.xml in the Hive conf directory so that hive.metastore.warehouse.dir points at HDFS by prepending "hdfs://master:8020". For example: hdfs://master:8020/user/hive/warehouse
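For reference, the warehouse setting in step 2 is a property in hive-site.xml; the host and port (master:8020) are from this answer's cluster, so adjust them for your NameNode:

```xml
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>hdfs://master:8020/user/hive/warehouse</value>
  <description>Default location of the Hive warehouse, on HDFS</description>
</property>
```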
Upvotes: 2