lamiaheart

Reputation: 29

Can't put file from local directory to HDFS

I have created a file named "file.txt" in my local directory, and now I want to put it into HDFS using:

]$ hadoop fs -put file.txt abcd

I am getting this response:

put: 'abcd': no such file or directory 

I have never worked on Linux before. Please help me out: how do I put the file "file.txt" into HDFS?

Upvotes: 2

Views: 3048

Answers (1)

RojoSam

Reputation: 1496

If you don't specify an absolute path in Hadoop (HDFS or whatever other file system is used), it will prepend your user directory to create an absolute path.

By default, your home folder in HDFS should be /user/<user name>.

So in your case, you are trying to create the file /user/<user name>/abcd and put the content of your local file.txt inside it.

The user name is your operating system user on your local machine. You can get it with the whoami command.
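For instance, you can see the user name Hadoop will use, and the HDFS home directory it resolves relative paths against, like this (whoami and id are standard commands; the printed path is only what HDFS *expects*, not proof the directory exists):

```shell
# Print the current OS user name; Hadoop uses this same
# name to resolve relative HDFS paths.
whoami

# Build the HDFS home directory path Hadoop will assume:
echo "/user/$(whoami)"
```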

The problem is that your user folder doesn't exist in HDFS, so you need to create it.

BTW, according to the Hadoop documentation, the correct command for working with HDFS is hdfs dfs instead of hadoop fs (https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/HDFSCommands.html). For now, though, both should work.

Then:

  1. If you don't know your user name on your local operating system, open a terminal and run the whoami command.
  2. Execute the following command, replacing <user name> with your user name.

hdfs dfs -mkdir -p /user/<user name>

And then you should be able to execute your put command.

NOTE: The -p parameter creates the parent /user folder too if it doesn't exist.
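Putting the steps above together, the full sequence looks like this (a sketch, assuming the hdfs client is on your PATH and your user has permission to create directories under /user; on some clusters that last step must be run as the HDFS superuser):

```shell
# 1. Find your OS user name (this is the name HDFS uses
#    to resolve relative paths).
USER_NAME=$(whoami)

# 2. Create your HDFS home directory; -p also creates
#    the parent /user folder if it is missing.
hdfs dfs -mkdir -p "/user/${USER_NAME}"

# 3. Now the relative destination "abcd" resolves to
#    /user/<user name>/abcd, and the put succeeds.
hdfs dfs -put file.txt abcd

# 4. Verify the file landed where expected.
hdfs dfs -ls "/user/${USER_NAME}"
```

If step 2 fails with a permission error, run it as the hdfs user (for example with sudo -u hdfs) or ask your cluster administrator to create the directory for you.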

Upvotes: 2
