user1723834

Reputation: 71

How to upload file to HDFS in Ubuntu

I'm new to Hadoop, using single-node Hadoop 1.2.1 on Ubuntu 14.04 LTS. I want to upload a file to HDFS for execution, but I don't know how to use the copyFromLocal command. Please tell me how to upload a file. My source file is at /home/saurabh/downloads/examples.jar and Hadoop is installed at /usr/local/hadoop/.

Upvotes: 5

Views: 22053

Answers (2)

Danyal Sandeelo

Reputation: 12401

 hadoop fs -put /path/to/file.ext  /usr/local/hadoop/

The put command transfers a file from your local filesystem to HDFS. In the line above, file.ext is copied to the /usr/local/hadoop directory in HDFS, and you can view it by running hadoop fs -ls /usr/local/hadoop

Upvotes: 2

Ashrith

Reputation: 6855

If hadoop is on your PATH, you can do this:

hadoop fs -put /home/saurabh/downloads/examples.jar /path/in/hdfs

If hadoop is not on your PATH, you should either export HADOOP_HOME (and add $HADOOP_HOME/bin to your PATH), or cd into the directory where Hadoop is installed and run the command from there.
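A minimal sketch of the HADOOP_HOME approach, assuming Hadoop is installed at /usr/local/hadoop as in the question:

```shell
# Sketch: make the hadoop command resolvable from any directory.
# /usr/local/hadoop is the install path given in the question.
export HADOOP_HOME=/usr/local/hadoop
export PATH="$PATH:$HADOOP_HOME/bin"
# After this, `hadoop fs -put ...` works without cd-ing into the install directory.
```

To make this permanent, append the two export lines to ~/.bashrc.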

You can also do the same with:

hadoop fs -copyFromLocal /home/saurabh/downloads/examples.jar /path/in/hdfs

where /path/in/hdfs is the destination in HDFS. For example, if you want to copy the file to /user/saurabh in HDFS, the command would be:

hadoop fs -put /home/saurabh/downloads/examples.jar /user/saurabh
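The full round trip can be sketched as follows. This assumes a running single-node Hadoop 1.2.1 cluster with hadoop on the PATH; /user/saurabh is the hypothetical HDFS destination from above, and the guard line simply skips the commands on machines without Hadoop:

```shell
# Skip gracefully if hadoop is not installed on this machine.
command -v hadoop >/dev/null 2>&1 || { echo "hadoop not on PATH"; exit 0; }

hadoop fs -mkdir /user/saurabh                                   # create the HDFS target dir (1.x syntax)
hadoop fs -put /home/saurabh/downloads/examples.jar /user/saurabh
hadoop fs -ls /user/saurabh                                      # confirm the upload landed
```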

Upvotes: 11
