Reputation: 41
I can create a table on top of an Avro file using the following syntax without any errors. It is an empty table at first.
CREATE EXTERNAL TABLE tableName
PARTITIONED BY (ingestiondatetime BIGINT, recordtype STRING)
ROW FORMAT SERDE
'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
TABLEPROPERTIES ('avro.schema.url'='hdfs:///user/file.avsc');
When I add the LOCATION line to point to the actual Avro file location, I get a permission error:
CREATE EXTERNAL TABLE tableName
PARTITIONED BY (ingestiondatetime BIGINT, recordtype STRING)
ROW FORMAT SERDE
'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/local/avro_dir/'
TABLEPROPERTIES ('avro.schema.url'='hdfs:///user/file.avsc');
The error is
> FAILED: Error in metadata.....
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=hive, access=WRITE, inode="/" hdfs:supergroup:drwxr-xr-x
I am running Hive as myself. My permissions on hdfs:/ are wide open (777). Where is Hive trying to write that it thinks it does not have permission?
Upvotes: 2
Views: 7337
Reputation: 611
Your current user is hive, which does not have write permission for inode="/".
Before starting Hive from the command prompt, try changing your current user to hdfs by running
export HADOOP_USER_NAME=hdfs
Upvotes: 0
Reputation: 780
TABLEPROPERTIES should be TBLPROPERTIES
Also, for those using an API, the trailing ";" is not allowed, at least with my versions of Spark and Hive, but you know how fast this changes.
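If you are building the statement in code, a simple workaround is to strip the trailing semicolon before handing the string to the API. This is a minimal sketch; strip_trailing_semicolon is a hypothetical helper of my own, not part of Spark or Hive:

```python
def strip_trailing_semicolon(sql: str) -> str:
    # Drop trailing whitespace and a final ';' so APIs such as
    # spark.sql(), which reject the terminator, accept the statement.
    sql = sql.rstrip()
    return sql[:-1].rstrip() if sql.endswith(';') else sql

print(strip_trailing_semicolon("SELECT 1 ;"))  # SELECT 1
```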
Upvotes: 0
Reputation: 1276
Hive is trying to do this as the hive user.
Here is the error and what it is trying to tell you:
Permission denied: user=hive, access=WRITE, inode="/" hdfs:supergroup:drwxr-xr-x
The exception tells you:
- who is doing it: user=hive
- what the user is trying to do: access=WRITE
- which directory is being accessed: inode="/"
- the permissions of that location: hdfs:supergroup:drwxr-xr-x
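If you want to pull those fields apart programmatically, here is a rough Python sketch that parses the message quoted in the question (the regex and field names are my own, not anything Hadoop provides):

```python
import re

# AccessControlException message copied from the question.
msg = ('Permission denied: user=hive, access=WRITE, '
       'inode="/" hdfs:supergroup:drwxr-xr-x')

# Extract who, what, and where from the message.
m = re.search(
    r'user=(?P<user>\S+), access=(?P<access>\S+), '
    r'inode="(?P<inode>[^"]*)" '
    r'(?P<owner>\S+):(?P<group>\S+):(?P<perms>\S+)',
    msg)

print(m.group('user'))    # hive
print(m.group('access'))  # WRITE
print(m.group('inode'))   # /
print(m.group('perms'))   # drwxr-xr-x
```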
run hadoop fs -ls /
The permissions you describe (777) are not reflected in the error message. The permissions in the error message show that everyone has read and execute, but only the owner (hdfs) has write.
As Amar states, seeing the full command would be helpful!
If this were me, it would be because I had a typo in the location, and hive was trying to create the directory.
If you run
hadoop fs -ls [your LOCATION property]
... does the location exist?
Best of luck!
Brian
Upvotes: 2