user3677291

Reputation: 91

Unable to copy files from local disk to HDFS

I have successfully installed Ubuntu 12.04 and Hadoop 2.4.0.

After entering the jps command, I see the output below:

4135 Jps
2582 SecondaryNameNode
3143 NodeManager
2394 NameNode
2391 DataNode
3021 ResourceManager

Now I want to run the WordCount example.

I created a .txt file with some content in it.

Whenever I try to copy it into HDFS with the following command:

hdfs -copyFromLocal /app/hadoop/tmp/input.txt /wordcount/input.txt

("wordcount" in the path is a directory which i have created)

it shows:

Unrecognized option: -copyFromLocal
Could not create the Java Virtual Machine

What am I doing wrong?

Upvotes: 2

Views: 7303

Answers (2)

aa8y

Reputation: 3942

The command you are using is not valid. Try:

hadoop fs -mkdir -p /wordcount/input
hadoop fs -put /app/hadoop/tmp/input.txt /wordcount/input/input.txt

You'll need to specify /wordcount/output as the output directory in this case, and it must not exist before you run the job; if it does, the job will fail. You can remove the directory with:

hadoop fs -rm -R /wordcount/output
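
Then you can run the WordCount job itself using the examples jar that ships with Hadoop (a sketch; the jar path below assumes the default Hadoop 2.4.0 layout under $HADOOP_HOME):

hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.0.jar wordcount /wordcount/input /wordcount/output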

Edit: To see the output files, check:

hadoop fs -ls /wordcount/output

To see the output on the console, use this:

hadoop fs -cat /wordcount/output/part*

Edit 2: The newer Hadoop CLI uses:

hdfs dfs <your_command_here>

For example,

hdfs dfs -ls /

Also, if the output files are gzip-compressed, you can read them with:

hdfs dfs -cat /wordcount/output/part* | gzip -d

Upvotes: 9

samthebest

Reputation: 31515

You forgot dfs:

hdfs dfs -copyFromLocal /blar /blar
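
Applied to the paths from your question (assuming the /wordcount directory already exists in HDFS, as you mention), that would be:

hdfs dfs -copyFromLocal /app/hadoop/tmp/input.txt /wordcount/input.txt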

IMO Scalding is the best tool to get started writing MapReduce programs. It's as concise as Pig but as flexible as Java.

Upvotes: 0
