ashwini

Reputation: 531

hadoop fs -get not working in ubuntu

I have created a single-node Hadoop cluster in Ubuntu.

I was trying to copy a file from HDFS to the local filesystem, but when I issued the command

hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee /home/output/ 

I got this message:

get: No such file or directory

How to fix this?

Upvotes: 0

Views: 287

Answers (2)

A J

Reputation: 492

The general format of the Hadoop shell command get is shown below:

hadoop fs -get <HDFS File> <local File Directory> 

You have used it as hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee /home/output/ — here /user/hduser/Employee is a directory, not a file.

You should do:

hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee[/FILENAME] /home/output/

If instead you want to copy a directory (i.e. a folder), you can use -copyToLocal:

hduser@ubuntu:/usr/local/hadoop/bin$ hadoop dfs -copyToLocal /user/hduser/Employee /home/output/

You can find Hadoop Shell Commands here.

Upvotes: 2

Abhishek

Reputation: 143

You need to make sure that /user/hduser is a directory and not a file. I once had this problem: I ran hadoop fs -ls, which showed -rwxr-xr-x.

A directory would show drwxr-xr-x. If this is the problem, you need to remove the entry using -rmr /user/hduser and create it again with -mkdir.
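The distinction above can be checked mechanically: the first character of the permissions column in hadoop fs -ls output is d for a directory and - for a regular file. A minimal sketch, using a hypothetical helper function (not part of Hadoop) that classifies a permission string:

```shell
# hdfs_entry_type is a hypothetical helper, not a Hadoop command.
# It classifies an entry from `hadoop fs -ls` output by the first
# character of its permission string: 'd' = directory, '-' = file.
hdfs_entry_type() {
  case "$1" in
    d*) echo "directory" ;;
    -*) echo "file" ;;
    *)  echo "unknown" ;;
  esac
}

hdfs_entry_type "drwxr-xr-x"   # permission string of a directory
hdfs_entry_type "-rwxr-xr-x"   # permission string of a regular file
```

If the helper reports "file" for /user/hduser, that matches the situation described in this answer.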

Other options: try -copyToLocal, or try downloading the file from the HDFS web portal, i.e. namenode_IP:50070.

Upvotes: 0
