Reputation: 6006
I'm trying to create an external table in Hive with the statement below, but it keeps failing with an error:
create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask (state=08S01,code=1)
Aborting command set because "force" is false and command failed: "create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";"
The contents of /tmp/hive_test_1375711405.45852.txt are:
abc\tdef
I'm connecting via the beeline command line interface, which uses Thrift HiveServer2.
Upvotes: 3
Views: 18017
Reputation: 1
We faced a similar problem at our company (a Sentry, Hive, and Kerberos combination). We solved it by removing all privileges granted on URIs that were not fully qualified. For example, we changed

GRANT ALL ON URI '/user/test' TO ROLE test;

to

GRANT ALL ON URI 'hdfs-ha-name:///user/test' TO ROLE test;

You can find the privileges for a specific URI in the Hive database (MySQL in our case).
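Before re-granting, Sentry's HiveQL extensions can also list what a role currently holds directly from beeline. A hedged sketch using the role and URI from the example above (exact syntax and output columns vary by Sentry version):

```sql
-- List privileges currently granted to the role (Sentry extension to HiveQL).
SHOW GRANT ROLE test;

-- Replace the non-fully-qualified URI grant with a fully qualified one.
REVOKE ALL ON URI '/user/test' FROM ROLE test;
GRANT ALL ON URI 'hdfs-ha-name:///user/test' TO ROLE test;
```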
Upvotes: -1
Reputation: 6006
The issue was that I was pointing the external table at a file in HDFS instead of a directory. The cryptic Hive error message really threw me off.
The solution is to create a directory and put the data file in it. To fix the above example, you'd create the directory /tmp/foobar, place hive_test_1375711405.45852.txt in it, and then create the table like so:
create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/foobar";
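The surrounding steps can be sketched from the same beeline session, assuming HiveServer2 is configured to allow dfs commands (paths are from the question):

```sql
-- Create the directory and move the data file into it; an external table's
-- LOCATION is read at query time, so the file can be moved in before or
-- after the CREATE TABLE statement runs.
dfs -mkdir -p /tmp/foobar;
dfs -mv /tmp/hive_test_1375711405.45852.txt /tmp/foobar/;

-- With the file in place, this should return one row: abc, def
select * from foobar;
```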
Upvotes: 3