trallallalloo

Reputation: 602

NoSuchMethodError when running a Hadoop job (Java)

Here is my run() method:

    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://localhost:9000");
    MongoConfigUtil.setOutputURI(conf, "mongodb://localhost/test/sensors");
    System.out.println("Conf : " + conf);

    @SuppressWarnings("deprecation")
    Job job = new Job(conf, "sensor");
    job.setJarByClass(HdfsToMongo.class);

    job.setMapperClass(TokenzierMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);

    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);

    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(MongoOutputFormat.class);

    FileInputFormat.setInputPaths(job, new Path("In"));

    return job.waitForCompletion(true) ? 0 : 1;

I have a Hadoop job that reads data from HDFS and writes it to MongoDB. When I run the code I get this exception:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.LocalJobRunner.<init>(Lorg/apache/hadoop/conf/Configuration;)V
at org.apache.hadoop.mapred.LocalClientProtocolProvider.create(LocalClientProtocolProvider.java:42)
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1266)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1262)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1261)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1290)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
at tr.com.vedat.hadoop.HdfsToMongo.run(HdfsToMongo.java:86)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at tr.com.vedat.hadoop.HdfsToMongo.main(HdfsToMongo.java:90)

In the HdfsToMongo class, the error occurs at the job.waitForCompletion(true) ? 0 : 1; line. Can anyone point out my mistake? Did I forget something in the code?

Upvotes: 1

Views: 1787

Answers (1)

vanekjar

Reputation: 2406

Be careful about the classes you are importing; you need to use the correct package in your imports.

You are probably importing the old API from the package org.apache.hadoop.mapred.

With the new API, you should instead import classes from the package org.apache.hadoop.mapreduce.
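As a sketch, the driver's import block would then look roughly like this (the mongo-hadoop connector package names below are assumptions based on the typical mongo-hadoop layout; check the version you have on your classpath):

```java
// Core Hadoop types
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;

// New MapReduce API: org.apache.hadoop.mapreduce, NOT org.apache.hadoop.mapred
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// mongo-hadoop connector classes (package names may vary by connector version)
import com.mongodb.hadoop.MongoOutputFormat;
import com.mongodb.hadoop.util.MongoConfigUtil;
```

If even one class (for example FileInputFormat or TextInputFormat) still comes from org.apache.hadoop.mapred, the job can compile against one API and fail at runtime with a NoSuchMethodError like the one in the stack trace.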

Upvotes: 1
