Ku002

Reputation: 117

Unable to change read write permissions to hdfs directory

I am trying to copy a text file into an HDFS location.
I'm facing an access issue, so I tried changing the permissions,
but I'm unable to do so and get the errors below:

chaithu@localhost:~$ hadoop fs -put test.txt /user
put: Permission denied: user=chaithu, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

chaithu@localhost:~$ hadoop fs -chmod 777 /user
chmod: changing permissions of '/user': Permission denied. user=chaithu is not the owner of inode=user

chaithu@localhost:~$ hadoop fs -ls /
Found 2 items
drwxrwxrwt   - hdfs supergroup          0 2017-12-20 00:23 /tmp
drwxr-xr-x   - hdfs supergroup          0 2017-12-20 10:24 /user

How can I grant full read and write access for all users to this HDFS folder?

Upvotes: 4

Views: 20449

Answers (2)

OneCricketeer

Reputation: 192023

First off, you shouldn't be writing into the /user folder directly, nor set 777 on it.

You're going to need a home directory for your current user even to run a MapReduce job, so first run sudo su - hdfs to become the HDFS superuser.

Then run these commands to create an HDFS home directory for your user account:

 hdfs dfs -mkdir -p /user/chaithu
 hdfs dfs -chown -R chaithu /user/chaithu
 hdfs dfs -chmod -R 770 /user/chaithu

Then exit from the hdfs user session, and chaithu can now write to its own HDFS directory:

hadoop fs -put test.txt

That alone will put the file in the current user's folder.


Or, if that's too much work for you, write to /tmp instead (the listing above shows it is world-writable: drwxrwxrwt).
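For instance, a minimal sketch (assumes a running HDFS cluster and that test.txt exists in your current local directory):

```shell
# /tmp is world-writable with the sticky bit (drwxrwxrwt),
# so any user can upload here without changing any permissions
hadoop fs -put test.txt /tmp/test.txt

# confirm the file landed
hadoop fs -ls /tmp/test.txt
```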


A lazy option is to impersonate the HDFS superuser from your own account:

export HADOOP_USER_NAME=hdfs 
hadoop fs -put test.txt /user

And this is why Hadoop is not secure and does not enforce user account access by default (i.e. never do this in production).


And finally, you can always turn permission checking completely off in hdfs-site.xml (again, only useful in development phases):

 <property>
   <name>dfs.permissions</name>
   <value>false</value>
 </property>
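Note that on Hadoop 2.x and later the property is named dfs.permissions.enabled (the bare dfs.permissions name is the older, deprecated spelling), so on a recent cluster the equivalent fragment would be:

```xml
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```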

Upvotes: 7

roh

Reputation: 1053

If you look at your hadoop fs -ls output, you can see that only the HDFS superuser (hdfs) has write permission on that path.

You have two solutions here.

One is to change the ownership through the superuser, making chaithu the owner (or group) of the path, something like hdfs dfs -chown -R hdfs:chaithu /path; then you will be able to access it as the owner or a group member. The other, dirty way is to run hdfs dfs -chmod -R 777 /path as the superuser; from a security standpoint, 777 is not good.

The second is to use ACLs, which give you temporary, fine-grained access.

Please go through this link for more understanding.

More on ACLs
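As a sketch of the ACL route (assumes ACLs are enabled on the cluster via dfs.namenode.acls.enabled=true in hdfs-site.xml, and that the commands run as the hdfs superuser; /user is the path from the question):

```shell
# grant chaithu write access to /user via an ACL entry,
# leaving the base hdfs:supergroup ownership and mode untouched
hdfs dfs -setfacl -m user:chaithu:rwx /user

# inspect the resulting ACL; the new entry appears alongside the base permissions
hdfs dfs -getfacl /user

# remove the entry again once it is no longer needed
hdfs dfs -setfacl -x user:chaithu /user
```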

This is basic and important for you to learn. Try the suggestions above, and let me know if they don't work; I can help more based on the error you get.

Upvotes: 4
