Anil Ekambram

Reputation: 36

Map-reduce JobConf - Error in adding FileInputFormat

I have created a Mapper using the syntax:

public class xyz extends MapReduceBase implements Mapper<LongWritable, Text, Text, Text>{
    -----
    public void map(LongWritable key, Text value,
        OutputCollector<Text, Text> output, Reporter reporter)
    --
}

In the job, I created a Job object:

Job job = new Job(getConf());

To this job, I am not able to add the Mapper class using:

job.setMapperClass(xyz.class);

error message:

The method setMapperClass(Class<? extends Mapper>) in the type Job is not applicable for the arguments (Class<InvertedIndMap1>)

I cannot use a mapper that extends Mapper, because I am using OutputCollector and Reporter in the mapper.

In the job, if I use JobConf instead of job like:

JobConf conf = new JobConf(getConf());

then conf.setMapperClass(xyz.class) works.

But I am not able to set the input paths using:

FileInputFormat.addInputPaths(conf,new Path(args[0]));

Error message:

The method addInputPaths(Job, String) in the type FileInputFormat is not applicable for the arguments (JobConf, Path)

I tried setInputPaths, setInputPath, and addInputPath, but got the same error each time. The same error occurs for addOutputPath/setOutputPath.

Please suggest a solution for this issue.

Upvotes: 0

Views: 2527

Answers (2)

Roopak T J

Reputation: 1

You are basically mixing imports from the two APIs: mapred (older) and mapreduce (newer). Include only one of them; preferably replace all of the older mapred classes with their new mapreduce equivalents.
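If you go the new-API route, a minimal sketch of the mapper might look like the following (class name and the placeholder key are assumptions, not from the question). Note that in the new API the Context object replaces both OutputCollector and Reporter, which addresses the concern about needing those two interfaces:

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// New-API mapper: extends Mapper (a class), not the old Mapper interface.
public class Xyz extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // context.write(...) replaces output.collect(...),
        // and Context also covers the Reporter role, e.g.
        // context.setStatus(...) and context.getCounter(...).
        context.write(new Text("someKey"), value); // placeholder logic
    }
}
```

With that mapper, job.setMapperClass(Xyz.class) compiles, and the matching input-format helper is org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(job, new Path(args[0])).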

Upvotes: 0

Aleksei Shestakov

Reputation: 2538

I think the problem is that you have imported an unsuitable FileInputFormat. I guess that you need to replace

import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

with

import org.apache.hadoop.mapred.FileInputFormat;
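With the old-API import in place, the whole driver can stay on the mapred API. A rough sketch, assuming a driver that has getConf() available (e.g. via Tool) and using the question's xyz mapper and args layout:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

// Old-API driver: everything comes from org.apache.hadoop.mapred.
JobConf conf = new JobConf(getConf(), xyz.class);
conf.setMapperClass(xyz.class);
// The mapred FileInputFormat overloads take (JobConf, Path...) /
// (JobConf, String), so this now matches:
FileInputFormat.setInputPaths(conf, new Path(args[0]));
FileOutputFormat.setOutputPath(conf, new Path(args[1]));
JobClient.runJob(conf);
```

The key point is consistency: a JobConf only works with the mapred FileInputFormat/FileOutputFormat, just as a Job only works with the mapreduce.lib.input/output variants.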

Upvotes: 5
