TamizhK

Reputation: 448

Getting an error in MapReduce without a mapper

I tried to use KeyValueTextInputFormat in the student marks example. This is the input:

s1 10
s2 50
s3 30
s1 100
s1 50
s2 30
s3 70
s3 50
s2 75

I used KeyValueTextInputFormat as the input format, so it brings in the student names (s1, s2, ...) as the keys and the marks (10, 50, ...) as the values. My aim is to find the total marks for each student, so I used just the reducer:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MarkReducer extends Reducer<Text, Text, Text, LongWritable> {

  @Override
  public void reduce(Text key, Iterable<Text> values, Context ctx) throws IOException, InterruptedException {
    long sum = 0;
    for (Text value : values) {
      sum += Long.parseLong(value.toString());
    }
    ctx.write(key, new LongWritable(sum));
  }
}

I neither created nor mentioned a mapper in the job. I am getting this error:

Error: java.io.IOException: Type mismatch in value from map: expected org.apache.hadoop.io.LongWritable, received org.apache.hadoop.io.Text
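
For reference, the job is set up roughly like this (simplified; the MarkDriver class name and the paths are just placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MarkDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // KeyValueTextInputFormat splits on a tab by default; the sample input is space-separated
    conf.set("mapreduce.input.keyvaluelinerecordreader.key.value.separator", " ");

    Job job = Job.getInstance(conf, "student marks");
    job.setJarByClass(MarkDriver.class);

    job.setInputFormatClass(KeyValueTextInputFormat.class);
    job.setReducerClass(MarkReducer.class);   // no setMapperClass call

    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);

    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}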

But if I use a dummy mapper like this,

public class MarkMapper extends Mapper<Text, Text, Text, Text> {

  @Override
  public void map(Text key, Text value, Context ctx) throws IOException, InterruptedException {
    ctx.write(key, value);
  }
}

I am able to get the proper output. Can someone please help me out?

Upvotes: 1

Views: 366

Answers (2)

vefthym

Reputation: 7462

The problem is that you have stated in your main method that the output of the program will be keys of type Text and values of type LongWritable. By default, these are also assumed to be the output types of the Mapper.

Also, the default mapper (the identity Mapper) assumes that the input it receives has the same types as its output, but in your case both the input and the output of the mapper should be Text key-value pairs.
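
For reference, the default Mapper's map method is essentially just an identity pass-through, roughly:

// Simplified from org.apache.hadoop.mapreduce.Mapper: the input key/value are
// written straight to the output. The framework then checks the emitted value
// against the declared map output value class (LongWritable here), but the
// value is still a Text, which is exactly the "Type mismatch in value from map" error.
protected void map(KEYIN key, VALUEIN value, Context context)
    throws IOException, InterruptedException {
  context.write((KEYOUT) key, (VALUEOUT) value);
}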

So, just add two calls in the main method to specify that the output of the Mapper should be of the form Text, Text:

job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(Text.class);

I think this should work. Otherwise, do as Mobin Ranjbar suggests.

Upvotes: 1

Mobin Ranjbar

Reputation: 1360

Change this line:

ctx.write(key, new LongWritable(sum));

to

ctx.write(key, new Text(String.valueOf(sum)));

in your reducer (and change the reducer's output value type from LongWritable to Text, plus job.setOutputValueClass(Text.class) in the driver, so the declared types match). Or change reduce(Text key, Iterable<Text> values, Context ctx) to reduce(Text key, Iterable<LongWritable> values, Context ctx).
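
With the first option, the reducer would look roughly like this (the fourth type parameter becomes Text):

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Sketch of the first option: Text values end to end, so the value type
// declared for the (default) map output and the reduce output matches
// what is actually emitted.
public class MarkReducer extends Reducer<Text, Text, Text, Text> {

  @Override
  public void reduce(Text key, Iterable<Text> values, Context ctx)
      throws IOException, InterruptedException {
    long sum = 0;
    for (Text value : values) {
      sum += Long.parseLong(value.toString());
    }
    ctx.write(key, new Text(String.valueOf(sum)));
  }
}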

Upvotes: 1
