Amit Sharma

Reputation: 39

Unable to read an existing file - No such file or directory

Each time I use the -put command to copy a local file to HDFS

$ hadoop fs -put file:///root/t1/t11 hdfs:///user/amit

it gives me the following error:

put: '/root/t1/t11': No such file or directory

I am sure the file exists. I changed the permissions, but it is still not going through.

Please help me.
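A quick way to see what is going on: hadoop fs -put reads the local path as the operating-system user actually running the command, which on many clusters is the hdfs service user rather than root. Since /root is conventionally mode 700, anything inside it is invisible to other users even when the file itself is world-readable. A minimal local sketch of that effect, using a hypothetical /tmp/demo_root directory:

```shell
# A 0700 directory hides its contents from every other user, even when
# the file inside is itself world-readable. This mirrors /root, which is
# why a non-root user gets "No such file or directory" for /root/t1/t11.
mkdir -p /tmp/demo_root/t1
echo "data" > /tmp/demo_root/t1/t11
chmod 644 /tmp/demo_root/t1/t11   # the file's own permissions look fine
chmod 700 /tmp/demo_root          # but the parent directory is owner-only
stat -c '%a' /tmp/demo_root       # prints 700
```

Any user other than root (for example the hdfs user) cannot open /tmp/demo_root/t1/t11 and gets exactly this kind of "No such file or directory" / "Permission denied" error.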

Upvotes: 3

Views: 1715

Answers (5)

Amit Sharma

Reputation: 39

The following combination of commands worked.

[root@amit ~]# mkdir /testing
[root@amit ~]# cp /root/t1/t11 /testing
[root@amit ~]# sudo chown hdfs:hadoop /testing
[root@amit ~]# sudo chown hdfs:hadoop /testing/t11
[root@amit ~]# sudo -u hdfs hadoop fs -put /testing/t11 hdfs:///user/amit

Here, hdfs is my Hadoop user name and hadoop is my Hadoop group name.

I cannot vote up yet as I am new, but thank you for the support.

Upvotes: 0

Ashrith

Reputation: 6855

I see that you are copying the file from the root user's Linux home directory (/root) to the amit user's HDFS home directory (/user/amit) as the hadoop user. One thing to keep in mind is that the root user's home directory is not readable by other users.

To make this work, copy the file to the Linux /tmp folder (which is universally readable), since the hadoop user cannot read files from the root user's home directory on the local file system (this requires sudo access):

sudo cp /root/t1/t11 /tmp
hadoop fs -put /tmp/t11 /user/amit
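The reason the /tmp workaround succeeds can be checked directly (1777 is the conventional mode for /tmp on Linux; verify on your own system):

```shell
# /tmp is conventionally mode 1777: world-readable and world-writable
# with the sticky bit set, so any local user - including the hdfs
# service user - can read a file copied there.
stat -c '%a' /tmp
```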

Upvotes: 0

Rajesh N

Reputation: 2574

Try using the put command from a folder other than /root. Run the following in a terminal (as the hadoop user):

sudo mkdir -p /usr/local/test
sudo cp /root/t1/t11 /usr/local/test/t11
sudo chown hadoop:hadoop /usr/local/test/t11
hadoop fs -mkdir -p /user/amit
hadoop fs -put /usr/local/test/t11 /user/amit/t11

In sudo chown hadoop:hadoop /usr/local/test/t11, hadoop:hadoop refers to your Hadoop user name and group name, respectively.
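If you are unsure which user name and group name to substitute for hadoop:hadoop, the id command prints them for whichever user runs it:

```shell
# Print the current user's name and primary group name; substitute
# these for hadoop:hadoop if your installation uses different names.
id -un
id -gn
```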

Upvotes: 1

Rijul

Reputation: 1445

Hope one of these helps:

$ hadoop fs -put localhost:///root/t1/t11 hdfs:///user/amit

or

$ hadoop fs -put /root/t1/t11 hdfs:///user/amit

or

$ hadoop fs -put /root/t1/t11 hdfs://your_hadoop_host_name/user/amit

In my case, all of these work.

Also, test using the hadoop user you have created.

Upvotes: 1

Yuliia Ashomok

Reputation: 8598

You can use the hdfs dfs -copyFromLocal command. Here is how to do it:

1. Creating input folder in DFS:

hdfs dfs -mkdir /input

2. Copy the file from the local disk to the newly created input directory in HDFS:

hdfs dfs -copyFromLocal /usr/local/test/t11 /input

Upvotes: 0
