yutechnet

Reputation: 171

Hive failed to create /user/hive/warehouse

I just got started with Apache Hive, and I am using my local Ubuntu 12.04 box, with Hive 0.10.0 and Hadoop 1.1.2.

Following the official "Getting Started" guide on the Apache website, I am now stuck at the Hadoop command that creates the Hive metastore directory:

$ $HADOOP_HOME/bin/hadoop fs -mkdir       /user/hive/warehouse

The error was: mkdir: failed to create /user/hive/warehouse

Does Hive require Hadoop in a specific mode? I know I didn't have to do much to my Hadoop installation other than update JAVA_HOME, so it is in standalone mode. I am sure Hadoop itself is working, since I can run the PI example that comes with the Hadoop installation.

Also, the other command to create /tmp shows that the /tmp directory already exists, so it wasn't recreated, and /bin/hadoop fs -ls lists the current directory.

So, how can I get around it?

Upvotes: 11

Views: 27515

Answers (10)

Anukriti P.

Reputation: 1

I am using macOS with Homebrew as the package manager. I had to set the following property in hive-site.xml:

<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/usr/local/Cellar/hive/2.3.1/libexec/conf/warehouse</value>
</property>
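
If Hive does not create that folder on its own, a minimal sketch for setting it up by hand (assuming the Homebrew path above) would be:

# create the local warehouse directory and make it group-writable (path as in the property above)
mkdir -p /usr/local/Cellar/hive/2.3.1/libexec/conf/warehouse
chmod g+w /usr/local/Cellar/hive/2.3.1/libexec/conf/warehouse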

Upvotes: 0

vmorusu

Reputation: 936

Adding this answer for reference for Cloudera CDH users who are seeing the same issue.

If you are using the Cloudera CDH distribution, make sure you have followed these steps:

  • Launch Cloudera Manager (Express / Enterprise) by clicking the desktop icon.
  • Open the Cloudera Manager page in a browser.
  • Start all services.

Cloudera has the /user/hive/warehouse folder created by default. It's just that YARN and HDFS might not be up and running, which is needed to access this path.
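
Once the services are up, a quick sanity check (a sketch; the path is the one CDH creates by default) is simply to list the directory:

hdfs dfs -ls /user/hive/warehouse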

Upvotes: 1

Steven Lowenthal

Reputation: 656

When setting Hadoop properties in the Spark configuration, prefix them with spark.hadoop.

Therefore, set:

conf.set("spark.hadoop.hive.metastore.warehouse.dir","/new/location")

This works for older versions of Spark. The property changed in Spark 2.0.0.
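
For example, passed on the command line (a sketch; the path is just a placeholder, and spark.sql.warehouse.dir is the setting that replaced it in Spark 2.0.0+):

# Spark before 2.0.0: pass the Hadoop/Hive property with the spark.hadoop. prefix
spark-shell --conf spark.hadoop.hive.metastore.warehouse.dir=/new/location

# Spark 2.0.0 and later: the warehouse location moved to spark.sql.warehouse.dir
spark-shell --conf spark.sql.warehouse.dir=/new/location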

Upvotes: 1

Ray Teale

Reputation: 221

Almost all examples in the documentation have this command wrong. Just like in Unix, you will need the "-p" flag to create the parent directories as well, unless you have already created them. This command will work:

$HADOOP_HOME/bin/hadoop fs -mkdir -p    /user/hive/warehouse
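
For completeness, the full sequence from the "Getting Started" guide, with -p added, looks roughly like this (the chmod g+w steps make the directories group-writable, as the guide recommends):

$HADOOP_HOME/bin/hadoop fs -mkdir -p /tmp
$HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
$HADOOP_HOME/bin/hadoop fs -chmod g+w /tmp
$HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse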

Upvotes: 22

Karthik Sridhar

Reputation: 259

  • Run the command below and then try to create the directory again; it opens up permissions on the HDFS /user directory (755 gives the owner full access and everyone else read/execute):

hadoop fs -chmod -R 755 /user

Upvotes: 0

HakkiBuyukcengiz

Reputation: 419

I recommend using a newer version of Hive, e.g. 1.1.0; version 0.10.0 is very buggy.

Upvotes: 0

Elan Hershcovitz

Reputation: 31

If you are running Linux, check the data directory and its permissions in Hadoop's core-site.xml. It looks like you have kept the default, which is /data/tmp, and in most cases that will require root permission. Change the XML config file, delete /data/tmp, and run a filesystem format (of course, after you have modified core-site.xml).
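
A rough sketch of those steps, assuming Hadoop 1.x as in the question and /data/tmp as the old data directory (back up anything you need first):

# stop Hadoop, remove the old data directory, reformat HDFS, then restart
$HADOOP_HOME/bin/stop-all.sh
rm -rf /data/tmp
$HADOOP_HOME/bin/hadoop namenode -format
$HADOOP_HOME/bin/start-all.sh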

Upvotes: 0

wonder.mice

Reputation: 7563

When running Hive on a local system, just add this to ~/.hiverc:

SET hive.metastore.warehouse.dir=${env:HOME}/Documents/hive-warehouse;

You can specify any folder to use as the warehouse. Obviously, any other Hive configuration method will do (hive-site.xml or hive -hiveconf, for example).
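
For instance, the hive -hiveconf variant would look something like this (a sketch; the path is just an example):

hive -hiveconf hive.metastore.warehouse.dir=$HOME/Documents/hive-warehouse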

That's possibly what Ambarish Hazarnis had in mind when saying "Create the warehouse in your home directory".

Upvotes: 6

yutechnet

Reputation: 171

While this is a simple permission issue that was resolved with sudo in my comment above, there are a couple of notes:

  1. Creating it in the home directory should work as well, but then you may need to update the Hive setting for the metastore path, which I think defaults to /user/hive/warehouse.

  2. I ran into another error with a CREATE TABLE statement in the Hive shell; the error was something like this:

hive> CREATE TABLE pokes (foo INT, bar STRING);
FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/pokes does not exist.)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

It turned out to be another permission issue: you have to create a group called "hive", add the current user to that group, and change the ownership of /user/hive/warehouse to that group (a sketch of those commands follows the link below). After that, it works. Details can be found at this link:

http://mail-archives.apache.org/mod_mbox/hive-user/201104.mbox/%[email protected]%3E
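
A sketch of those commands (assuming Ubuntu, as in the question; group name and path as described above):

# create a "hive" group, add the current user to it,
# and give that group ownership of and write access to the warehouse
sudo groupadd hive
sudo usermod -aG hive $USER
$HADOOP_HOME/bin/hadoop fs -chgrp -R hive /user/hive/warehouse
$HADOOP_HOME/bin/hadoop fs -chmod -R g+w /user/hive/warehouse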

Upvotes: 0

Ambarish Hazarnis

Reputation: 186

This seems like a permission issue. Do you have access to the root folder / ? Try the following options (a sketch of both follows the list):

1. Run the command as a superuser

OR

2. Create the warehouse in your home directory.
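
A sketch of both options (paths here are just placeholders):

# Option 1: create the directory with elevated privileges, then hand it over to your user
# (plain sudo works for a local/standalone setup; on a real cluster, run as the HDFS superuser)
sudo $HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
sudo $HADOOP_HOME/bin/hadoop fs -chown -R $USER /user/hive/warehouse

# Option 2: use a directory in your home folder and point
# hive.metastore.warehouse.dir at it (e.g. in hive-site.xml)
$HADOOP_HOME/bin/hadoop fs -mkdir -p $HOME/hive-warehouse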

Let us know if this helps. Good luck!

Upvotes: 3
