Reputation: 13
I got a permission denied failure from HDFS while running the command below:
hive -e "insert overwrite directory '/user/hadoop/a/b/c/d/e/f' select * from table_name limit 10;"
The error message is:
Permission denied: user=hadoop, access=WRITE, inode="/user/hadoop/a/b":hdfs:hive:drwxrwxr-x
But when I run hadoop fs -ls /user/hadoop/a, I get:
drwxrwxrwx - hadoop supergroup 0 2014-04-08 00:56 /user/hadoop/a/b
It seems I have opened up full permissions on folder b, so why did I still get permission denied?
PS: I have set hive.insert.into.multilevel.dirs=true
in the Hive config file.
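In case it is relevant, I believe the same property can also be set per session rather than in the config file; as far as I know this is equivalent:
hive -e "set hive.insert.into.multilevel.dirs=true; insert overwrite directory '/user/hadoop/a/b/c/d/e/f' select * from table_name limit 10;"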
Upvotes: 1
Views: 5186
Reputation: 1053
The issue is not actually with the directory permissions. Hive itself needs to be granted access to the path; what I mean is access at the Hive authorization level, not at the file level.
Below are the steps for granting a user/group access to the HDFS path and to the database. Comments on each command start with #.
#Login as hive superuser to perform the below steps
create role <role_name_x>;
#For granting to database
grant all on database <database_name> to role <role_name_x>;
#For granting to HDFS path
grant all on URI '/hdfs/path' to role <role_name_x>;
#Granting the role to the group of the user that will run the Hive job (roles are granted to groups; on Linux a user's primary group usually matches the user name)
grant role <role_name_x> to group <your_user_name>;
#After you perform the above steps you can validate with the commands below
#show grant role should list the URI and database privileges when you run it against the role name as below
show grant role <role_name_x>;
#Now validate that the user's group has been granted the role
show role grant group <your_user_name>;
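To make the above concrete, here is a minimal sketch with hypothetical names (role etl_writer, the default database, and the path from the question; note that Sentry generally wants the URI fully qualified):
create role etl_writer;
grant all on database default to role etl_writer;
grant all on URI 'hdfs://<namenode>:8020/user/hadoop/a/b' to role etl_writer;
grant role etl_writer to group hadoop;
show grant role etl_writer;
show role grant group hadoop;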
Here is one of my answers to a similar question, solved through Impala. More on Hive permissions.
One other suggestion, based on the other answers and comments here: if you want to inspect the permissions on an HDFS path or file, hdfs dfs -ls
is not your friend; it only shows the classic permission bits and is the old-school approach. Use hdfs dfs -getfacl /hdfs/path instead,
which gives you the complete details, including any ACLs. The result looks something like this:
hdfs dfs -getfacl /tmp/
# file: /tmp
# owner: hdfs
# group: supergroup
# flags: --t
user::rwx
group::rwx
other::rwx
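Relatedly, if the failure really is at the HDFS level, a named ACL entry is a more targeted fix than opening a directory to everyone. A sketch, assuming ACLs are enabled on the cluster (dfs.namenode.acls.enabled=true):
hdfs dfs -setfacl -R -m user:hadoop:rwx /user/hadoop/a/b
hdfs dfs -getfacl /user/hadoop/a/b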
Upvotes: 0
Reputation: 21878
I had the same problem, and I solved it simply by using the fully qualified HDFS path, like this:
hive -e "insert overwrite directory 'hdfs://<cluster>/user/hadoop/a/b/c/d/e/f' select * from table_name limit 10;"
See here for a mention of this issue.
I do not know the root cause, but it is not related to permissions.
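If you are not sure what to put for <cluster>, you can read the default filesystem URI straight from the client configuration; this is a standard HDFS utility:
hdfs getconf -confKey fs.defaultFS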
Upvotes: 1
Reputation: 2876
Open a new terminal, then try this:
1.) Change user to root:
su
2.) Change user to hdfs:
su hdfs
3.) Then run this command:
hadoop fs -chown -R hadoop /user/hadoop/a
Now you can try the command you were running.
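Alternatively, if sudo is set up on the machine, the three steps above can be collapsed into one command run as the hdfs superuser (assuming your account has sudo rights for this):
sudo -u hdfs hadoop fs -chown -R hadoop /user/hadoop/a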
Hope it helps...!!!
Upvotes: 0