mj_

Reputation: 6447

Hadoop ClassNotFoundException

I'm writing my first Hadoop application and I'm getting an error. I don't quite understand what some of the details in this stack trace mean. It's a ClassNotFoundException. I'm building this on Ubuntu Linux 12.10 with Eclipse 3.8.0 and Java 1.6.0_24. I installed Hadoop by downloading it from the Apache site and building it with Ant.

The crash happens on the first line of the program, where I create the job.

public static void main(String[] args) throws IOException, InterruptedException, ClassNotFoundException {

    Job job = new Job(); // <<== crashing here


Program [Java Application]  
com.sandbox.hadoop.Program at localhost:33878   
    Thread [main] (Suspended (exception ClassNotFoundException))    
        owns: Launcher$AppClassLoader  (id=29)  
        owns: Class<T> (org.apache.hadoop.security.UserGroupInformation) (id=25)    
        URLClassLoader$1.run() line: 217    
        AccessController.doPrivileged(PrivilegedExceptionAction<T>, AccessControlContext) line: not available [native method]   
        Launcher$AppClassLoader(URLClassLoader).findClass(String) line: 205 
        Launcher$AppClassLoader(ClassLoader).loadClass(String, boolean) line: 321   
        Launcher$AppClassLoader.loadClass(String, boolean) line: 294    
        Launcher$AppClassLoader(ClassLoader).loadClass(String) line: 266    
        DefaultMetricsSystem.<init>() line: 37  
        DefaultMetricsSystem.<clinit>() line: 34    
        UgiInstrumentation.create(Configuration) line: 51   
        UserGroupInformation.initialize(Configuration) line: 216    
        UserGroupInformation.ensureInitialized() line: 184  
        UserGroupInformation.isSecurityEnabled() line: 236  
        KerberosName.<clinit>() line: 79    
        UserGroupInformation.initialize(Configuration) line: 209    
        UserGroupInformation.ensureInitialized() line: 184  
        UserGroupInformation.isSecurityEnabled() line: 236  
        UserGroupInformation.getLoginUser() line: 477   
        UserGroupInformation.getCurrentUser() line: 463 
        Job(JobContext).<init>(Configuration, JobID) line: 80   
        Job.<init>(Configuration) line: 50  
        Job.<init>() line: 46   
        Program.main(String[]) line: 17 
/usr/lib/jvm/java-6-openjdk-amd64/bin/java (Jan 14, 2013 2:42:36 PM)    

Console Output:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:216)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:477)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
at org.apache.hadoop.mapreduce.JobContext.<init>(JobContext.java:80)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:50)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:46)
at com.sandbox.hadoop.Program.main(Program.java:18)

Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
... 16 more

Upvotes: 5

Views: 25026

Answers (3)

evaliotiri

Reputation: 33

I was facing the same problem. I solved it by adding commons-configuration-x.x.jar to my build path. It is under $HADOOP_HOME/lib.
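
As a quick sanity check (a minimal sketch, not part of the original answer), you can verify that the jar is actually visible to your program by loading the missing class reflectively; if the jar is still absent from the build path, this throws the same ClassNotFoundException:

    // Hypothetical check: fails with ClassNotFoundException if
    // commons-configuration-x.x.jar is not on the classpath.
    public class ClasspathCheck {
        public static void main(String[] args) throws ClassNotFoundException {
            Class.forName("org.apache.commons.configuration.Configuration");
            System.out.println("commons-configuration is on the classpath");
        }
    }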

Upvotes: 2

Chris White

Reputation: 30089

Does your main program need org.apache.commons.configuration.Configuration or should this be org.apache.hadoop.conf.Configuration?

It looks like Eclipse has auto-imported the wrong Configuration class, which isn't on the classpath when Hadoop runs on your cluster.

Can you share your source code, in particular the com.sandbox.hadoop.Program class, main method?
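
For reference, here is a minimal driver sketch (hypothetical, since the original Program.java isn't shown) using the Hadoop Configuration class rather than the Commons one:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration; // note: hadoop.conf, not commons.configuration
    import org.apache.hadoop.mapreduce.Job;

    public class Program {
        public static void main(String[] args)
                throws IOException, InterruptedException, ClassNotFoundException {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "sandbox job");
            // ... set mapper, reducer, input and output paths here ...
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }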

Upvotes: 2

Charles Menguy

Reputation: 41458

You should add all the jars found in /usr/lib/hadoop-0.xx/lib to avoid this kind of classpath issue.

To give you an idea, you can type hadoop classpath, which will print the classpath needed to get the Hadoop jar and its required libraries.

In your case, you're missing hadoop-common-0.xx.jar, so add it to the classpath and you should be good to go.
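
If you want to compare what your Eclipse launch is actually using against what hadoop classpath reports, a small sketch (an assumption on my part: run it with the same launch configuration as Program):

    import java.io.File;

    // Prints the current JVM's classpath, one entry per line, so it can be
    // diffed against the output of `hadoop classpath` on the command line.
    public class PrintClasspath {
        public static void main(String[] args) {
            String cp = System.getProperty("java.class.path");
            for (String entry : cp.split(File.pathSeparator)) {
                System.out.println(entry);
            }
        }
    }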

Upvotes: 10
