Reputation: 1
I am trying to run the example AggregateWordCount, but I get this error. When I use this code to run WordCount, everything works fine.
#!/bin/bash
# test the hadoop cluster by running wordcount
# create input files
mkdir input
echo "Hello World" >input/file2.txt
echo "Hello Hadoop" >input/file1.txt
# create input directory on HDFS
hadoop fs -mkdir -p input
# put input files to HDFS
hdfs dfs -put ./input/* input
# run wordcount
#hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/sources/hadoop-mapreduce-examples-2.7.7-sources.jar org.apache.hadoop.examples.WordCount input output
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/sources/hadoop-mapreduce-examples-2.7.7-sources.jar org.apache.hadoop.examples.AggregateWordCount input output
# print the input files
echo -e "\ninput file1.txt:"
hdfs dfs -cat input/file1.txt
echo -e "\ninput file2.txt:"
hdfs dfs -cat input/file2.txt
# print the output of wordcount
echo -e "\nwordcount output:"
hdfs dfs -cat output/part-r-00000
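One thing that stands out: the script runs the `-sources.jar`, which packages the Java source files rather than the compiled classes. The compiled examples jar normally sits one directory up, and the examples driver exposes AggregateWordCount under the name `aggregatewordcount`, which (if I recall its usage correctly) expects a reducer count and an input-format argument as well. A hedged sketch of the invocation, assuming the standard 2.7.7 install layout:

```shell
# Sketch, not verified against your cluster: use the compiled examples jar
# (not the -sources jar) and pass the extra arguments the aggregate
# examples expect: <input> <output> <numReducers> <textinputformat|seq>
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.7.jar \
    aggregatewordcount input output 1 textinputformat
```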
Upvotes: 0
Views: 1168
Reputation: 29185
Do a recursive listing on HDFS with the command below to find where the job actually wrote `output/part-r-00000`:
hadoop fs -ls -R /user/your_directory | grep -i "output/part-r-00000"
This should recursively list any matching paths. Adjust your code or script to point to that location.
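The same list-and-filter pattern can be tried out locally; the sketch below mocks a MapReduce-style output directory (the `demo/` path is made up for illustration) and filters on just `part`, since older-API jobs may name the file `part-00000` rather than `part-r-00000`:

```shell
# Mock a MapReduce-style output layout on the local filesystem
mkdir -p demo/output
touch demo/output/_SUCCESS demo/output/part-00000

# Recursively list and filter, mirroring:
#   hadoop fs -ls -R /user/your_directory | grep -i "part"
found=$(ls -R demo | grep -i "part")
echo "$found"
```

Grepping on the broader `part` pattern catches the output file whichever naming convention the job used.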
Upvotes: 0