Reputation: 2316
I have a local VM that has Hortonworks Hadoop and HDFS installed on it. I ssh'ed into the VM from my machine, and now I am trying to copy a file from my local filesystem into HDFS through the following set of commands:
[root@sandbox ~]# sudo -u hdfs hadoop fs -mkdir /folder1/
[root@sandbox ~]# sudo -u hdfs hadoop fs -copyFromLocal /root/folder1/file1.txt /hdfs_folder1/
When I execute it, I get the following error:
copyFromLocal: '/root/folder1/file1.txt': No such file or directory
I can see that file right in the /root/folder1/ directory, but the hdfs command throws the above error. I also tried to cd into /root/folder1/ and then execute the command, but the same error comes up. Why is the file not found when it is right there?
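For what it's worth, a quick sanity check (assuming the default sandbox setup, where /root is not readable by other users) is to try reading the file as the hdfs user:

sudo -u hdfs ls -l /root/folder1/file1.txt

If that fails with "Permission denied", the hdfs user cannot see the local file at all, which hadoop fs then reports as "No such file or directory".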
Upvotes: 4
Views: 15737
Reputation: 1
I had the same problem running a Hortonworks 4-node cluster. As mentioned, the "hdfs" user doesn't have permission to root's home directory. The solution is to copy the file from the root folder to a location the "hdfs" user can access. In a standard Hortonworks installation this is /home/hdfs.
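You can confirm the permission issue first (a simple check; the exact mode varies by OS, but on the sandbox /root is typically not readable by other users):

ls -ld /root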
As root, run the following:
mkdir /home/hdfs/folder1
cp /root/folder1/file1.txt /home/hdfs/folder1
Now switch to the hdfs user and work from a directory that user can access:
su hdfs
cd /home/hdfs/folder1
Now you can put the file into HDFS as the hdfs user:
hdfs dfs -put file1.txt /hdfs_folder1
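To verify the upload (assuming the target directory /hdfs_folder1 already exists in HDFS, as in the question):

hdfs dfs -ls /hdfs_folder1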
Upvotes: 0
Reputation: 3956
When you run sudo -u hdfs hadoop fs ..., it tries to read the file /root/folder1/file1.txt as the hdfs user, which has no access to /root.

One option is chmod -R 755 /root, which changes permissions on the directory and its files recursively, so that sudo -u hdfs can then copy the file from the local file system to HDFS. But it is not recommended to open up permissions on root's home directory. The better practice is to create a user space for root in HDFS and copy the files directly as root:
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
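With that user space in place, root can run Hadoop commands directly, with no sudo -u hdfs needed. A sketch reusing the paths from the question (assuming /hdfs_folder1/ already exists in HDFS):

hadoop fs -copyFromLocal /root/folder1/file1.txt /hdfs_folder1/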
Upvotes: 2