aa8y

Reputation: 3942

Handling Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

I am using CDH4 and have written a MapReduce application using the new mapreduce API. I compiled it against hadoop-core-1.0.3.jar, and when I run it on my Hadoop cluster I get the error:

Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

I referred to this StackOverflow question, which seems to be about the same problem. The answer suggests compiling our code against a hadoop-core-2.X.jar file, but I am unable to find anything like that.

So how do I compile my code so that it runs correctly on CDH4?

Upvotes: 3

Views: 15700

Answers (2)

Jeff Miller

Reputation: 588

In my case, I found that I was using a version of Hadoop that was incompatible with Avro 1.7.4 (http://www.cloudera.com/content/cloudera/en/documentation/cdh4/v4-2-0/CDH4-Release-Notes/cdh4ki_topic_2_9.html).

Upvotes: 0

aa8y

Reputation: 3942

The answer in the link I posted in the question suggested compiling against the Hadoop 2.0 libraries. It turns out that post Hadoop 1.0, instead of a single Hadoop Core jar, two (or maybe more) different jars are needed for compilation.

I used:

hadoop-common-2.0.2-alpha.jar
hadoop-mapreduce-client-core-2.0.2-alpha.jar

to compile my code, and after that it ran fine without the aforementioned error.
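For reference, a rough sketch of what the compile and packaging step could look like with those two jars on the classpath. The jar locations and the MyJob.java source name are placeholders, not from the original post:

    # Compile against the Hadoop 2.x jars named above (adjust paths to where the jars live)
    javac -classpath hadoop-common-2.0.2-alpha.jar:hadoop-mapreduce-client-core-2.0.2-alpha.jar \
          -d classes MyJob.java

    # Package the compiled classes into a job jar
    jar cvf myjob.jar -C classes .

The resulting jar can then be submitted as usual with hadoop jar myjob.jar <main-class> <args>, so the CDH4 cluster supplies its own Hadoop 2 classes at runtime.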

Upvotes: 8
