Praveen Sripati

Reputation: 33495

Why do we need to set the output key/value class explicitly in the Hadoop program?

In the "Hadoop : The Definitive Guide" book, there is a sample program with the below code.

JobConf conf = new JobConf(MaxTemperature.class);  
conf.setJobName("Max temperature");  
FileInputFormat.addInputPath(conf, new Path(args[0]));  
FileOutputFormat.setOutputPath(conf, new Path(args[1]));  
conf.setMapperClass(MaxTemperatureMapper.class);  
conf.setReducerClass(MaxTemperatureReducer.class);  
conf.setOutputKeyClass(Text.class);  
conf.setOutputValueClass(IntWritable.class);  

The MR framework should be able to figure out the output key and value classes from the Mapper and Reducer classes that are set on the JobConf. Why do we need to set the output key and value classes explicitly on the JobConf? Also, there is no similar API for the input key/value classes.

Upvotes: 6

Views: 3920

Answers (1)

Thomas Jungblut

Reputation: 20969

The reason is type erasure[1]. The output K/V classes are specified as generic type parameters on your Mapper and Reducer, and generics are erased by the compiler. So at job setup time (which is run time, not compile time), that type information is no longer available to the framework, and you have to state the classes explicitly.
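For illustration, here is a minimal, self-contained sketch (plain Java, not tied to the Hadoop API) of what erasure means in practice: given only a Class object, the generic type arguments are gone at run time, which is exactly the situation the framework is in when it is handed your mapper class.

import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {

    // A generic holder, loosely analogous to a Mapper<K1, V1, K2, V2>:
    // the type argument T exists only at compile time.
    static class Holder<T> {
        T value;
    }

    public static void main(String[] args) {
        List<String> strings = new ArrayList<String>();
        List<Integer> ints = new ArrayList<Integer>();

        // Both lists have the same runtime class; the type arguments
        // were erased by the compiler.
        System.out.println(strings.getClass() == ints.getClass()); // prints true

        // From the Class object alone there is no way to recover T
        // for a particular instance.
        Holder<String> h = new Holder<String>();
        System.out.println(h.getClass().getName()); // prints ErasureDemo$Holder
    }
}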

The input k/v classes can be read from the input file itself; in the case of SequenceFiles, the class names are stored in the file header, and you can see them if you open a sequence file in a text editor. That header has to be written as well, and since every map output is a SequenceFile, you need to provide the classes up front.
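As a rough sketch (assuming the older org.apache.hadoop.mapred-era API, with the path to an existing SequenceFile passed as args[0]), this is how the key/value classes recorded in a SequenceFile header can be read back:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;

public class SeqFileHeader {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path(args[0]); // path to an existing SequenceFile

        // The key/value class names are stored in the file header,
        // so a reader can recover them without any extra configuration.
        SequenceFile.Reader reader = new SequenceFile.Reader(fs, file, conf);
        try {
            System.out.println("key class:   " + reader.getKeyClassName());
            System.out.println("value class: " + reader.getValueClassName());
        } finally {
            reader.close();
        }
    }
}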

[1] http://download.oracle.com/javase/tutorial/java/generics/erasure.html

Upvotes: 8
