Reputation: 1051
I cannot understand what exactly the cleanup method in Hadoop does, or how it works. I have the following MapReduce code to calculate the max, min, and mean of a bunch of numbers.
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class Statistics
{
    public static class Map extends Mapper<LongWritable, Text, Text, Text>
    {
        // Running aggregates updated in map(); declared here (types assumed) so cleanup() can emit them
        private double min = Double.MAX_VALUE;
        private double max = -Double.MAX_VALUE;
        private double linear_sum = 0;
        private double quadratic_sum = 0;
        private long count = 0;

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
        {
            /* code to calculate min, max, and mean from among a bunch of numbers */
        }

        @Override
        public void cleanup(Context context) throws IOException, InterruptedException
        {
            // Runs once per map task, after the last call to map()
            Text key_min = new Text();
            key_min.set("min");
            Text value_min = new Text();
            value_min.set(String.valueOf(min));
            context.write(key_min, value_min);

            Text key_max = new Text();
            key_max.set("max");
            Text value_max = new Text();
            value_max.set(String.valueOf(max));
            context.write(key_max, value_max);

            Text key_avg = new Text();
            key_avg.set("avg");
            Text value_avg = new Text();
            value_avg.set(String.valueOf(linear_sum) + "," + count);
            context.write(key_avg, value_avg);

            Text key_stddev = new Text();
            key_stddev.set("stddev");
            Text value_stddev = new Text();
            value_stddev.set(String.valueOf(linear_sum) + "," + count + "," + String.valueOf(quadratic_sum));
            context.write(key_stddev, value_stddev);
        }
    }

    public static class Reduce extends Reducer<Text, Text, Text, Text>
    {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
        {
            /* code to further find min, max and mean from among the outputs of different mappers */
        }
    }

    public static void main(String[] args) throws Exception
    {
        /* driver program */
    }
}
So what exactly is the cleanup(Context context) method doing here? I am assuming it collects the output (key, value) pairs from a bunch of mappers and passes them on to the reducer. On other sites I have read that the order things run in MapReduce is: setup -> map -> cleanup, and then setup -> reduce -> cleanup. Why is this program not using a setup method?
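To make the second part concrete, this is roughly what I understand an overridden setup() would look like in my Map class, if I added one (the field names are the placeholders from my code above; this is just a sketch, not something I am running):

@Override
protected void setup(Context context) throws IOException, InterruptedException
{
    // Runs once per map task, before the first call to map():
    // reset the running aggregates here instead of relying on field initializers.
    min = Double.MAX_VALUE;
    max = -Double.MAX_VALUE;
    linear_sum = 0;
    quadratic_sum = 0;
    count = 0;
}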
Upvotes: 1
Views: 957
Reputation: 41
Such values should not be calculated in the Mapper; they should be calculated in the Reduce step. https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html#Reducer
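For reference, a minimal sketch of what that Reduce step could look like, as a drop-in for the Reduce class in the question (the "min"/"max"/"avg" keys and the "linear_sum,count" value format come from the question's cleanup(); the "stddev" key is omitted for brevity, and everything else is an assumption):

public static class Reduce extends Reducer<Text, Text, Text, Text>
{
    @Override
    public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
    {
        String k = key.toString();
        if (k.equals("min")) {
            // Each mapper emitted its local minimum; take the global minimum.
            double min = Double.MAX_VALUE;
            for (Text v : values) {
                min = Math.min(min, Double.parseDouble(v.toString()));
            }
            context.write(key, new Text(String.valueOf(min)));
        } else if (k.equals("max")) {
            // Each mapper emitted its local maximum; take the global maximum.
            double max = -Double.MAX_VALUE;
            for (Text v : values) {
                max = Math.max(max, Double.parseDouble(v.toString()));
            }
            context.write(key, new Text(String.valueOf(max)));
        } else if (k.equals("avg")) {
            // Each mapper emitted "linear_sum,count"; combine them all and divide once at the end.
            double sum = 0;
            long count = 0;
            for (Text v : values) {
                String[] parts = v.toString().split(",");
                sum += Double.parseDouble(parts[0]);
                count += Long.parseLong(parts[1]);
            }
            context.write(key, new Text(String.valueOf(sum / count)));
        }
    }
}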
Upvotes: 1