AkD

Reputation: 437

HBase Dependency Issue: NoClassDefFoundError RegionCoprocessorHost

I am trying to resolve the following error:

13/05/05 19:49:04 INFO handler.OpenRegionHandler: Opening of region {NAME => '-ROOT-,,0', STARTKEY => '', ENDKEY => '', ENCODED => 70236052,} failed, marking as FAILED_OPEN in ZK
13/05/05 19:49:04 INFO regionserver.HRegionServer: Received request to open region: -ROOT-,,0.70236052
13/05/05 19:49:04 INFO regionserver.HRegion: Setting up tabledescriptor config now ...
13/05/05 19:49:04 ERROR handler.OpenRegionHandler: Failed open of region=-ROOT-,,0.70236052, starting to roll back the global memstore size.
java.lang.IllegalStateException: Could not instantiate a region instance.
    at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:3747)
    at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:3927)
    at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:332)
    at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:108)
    at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:175)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedConstructorAccessor17.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:3744)
    ... 7 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost
    at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:421)
    ... 11 more

I have the following Maven dependencies:

<properties>
  <hadoopCDHMRVersion>2.0.0-mr1-cdh4.2.0</hadoopCDHMRVersion>
  <hadoopCDHVersion>2.0.0-cdh4.2.0</hadoopCDHVersion>
  <hbaseCDHVersion>0.94.2-cdh4.2.0</hbaseCDHVersion>
</properties>

<dependencyManagement>
<dependencies>
<!-- Apache -->
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>${hadoopCDHMRVersion}</version>
        <exclusions>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-compiler</artifactId>
          </exclusion>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-runtime</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoopCDHVersion}</version>
        <exclusions>
          <exclusion>
            <groupId>org.mockito</groupId>
            <artifactId>mockito-all</artifactId>
          </exclusion>
          <exclusion>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
          </exclusion>
          <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
          </exclusion>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-compiler</artifactId>
          </exclusion>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-runtime</artifactId>
          </exclusion>
          <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty</artifactId>
          </exclusion>
          <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty-util</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoopCDHVersion}</version>
      </dependency>
      <!-- Test -->
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase</artifactId>
        <scope>test</scope>
        <classifier>tests</classifier>
        <version>${hbaseCDHVersion}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase</artifactId>
        <scope>provided</scope>
        <version>${hbaseCDHVersion}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-test</artifactId>
        <version>${hadoopCDHMRVersion}</version>
        <scope>test</scope>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-minicluster</artifactId>
        <version>${hadoopCDHMRVersion}</version>
        <scope>test</scope>
      </dependency>
  </dependencies>
</dependencyManagement>

I bring these dependencies from the parent POM into the child POM. The code I test against:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.HConstants;

// Start a mini cluster to run the unit test against
final Configuration startingConf = HBaseConfiguration.create();
startingConf.setLong("hbase.client.keyvalue.maxsize", 65536);
startingConf.setStrings(HConstants.ZOOKEEPER_QUORUM, "localhost");
startingConf.setStrings("mapreduce.jobtracker.address", "local");
startingConf.setLong(HConstants.HBASE_CLIENT_PAUSE, 50);
startingConf.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER, 200);
testUtil = new HBaseTestingUtility(startingConf);
// Point of failure
testUtil.startMiniCluster();


The failure happens after startMiniCluster(). It does most of the work of instantiating the environment, but drops out part-way through with the error above. Things I tried:

  1. If I roll back hbaseCDHVersion from 0.94.2-cdh4.2.0 to any 0.92.1-cdh4.X.X version, it works.
  2. Removed the .m2 cache completely and verified that only 0.94.2-cdh4.2.0 is re-created.
  3. Tried almost all versions of 0.94.2-cdh4.X.X.
  4. Ran mvn clean install from the command line rather than relying on Eclipse to do the magic; also tried eclipse:eclipse.
  5. Checked the type/resource for the missing class through Eclipse; it points to the correct version in the local repo, so Eclipse can resolve it.
  6. Observed the dependency tree for any conflicts (see the command sketch after this list).
  7. Opened the repo jar myself and confirmed the class exists.
  8. Tried creating a new project and writing the POM from scratch.
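For step 6, a minimal sketch of inspecting the tree with the stock Maven Dependency Plugin; the -Dverbose flag also prints the versions Maven omitted during conflict mediation, which is where a clashing transitive jar shows up. The groupId:artifactId in the second command is a hypothetical placeholder:

mvn dependency:tree -Dverbose

mvn dependency:tree -Dverbose -Dincludes=some.group:some-artifact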

Any pointers are greatly appreciated.

Upvotes: 1

Views: 2531

Answers (1)

AkD

Reputation: 437

The problem was with the commons-configuration jar. The parent POM was bringing in version 1.9, which conflicted with the 1.6 version that the hadoop-common jar pulls in transitively. The only way to find this was to keep a minimal set of dependencies in the parent POM and uncomment them one by one to narrow down the culprit. Once found, simply exclude the conflicting copy at the dependency that drags it in, or roll the jar back from 1.9 to 1.6 so only one version is on the classpath (a sketch of both options follows). Hope this helps someone. The Hadoop jars really should upgrade their commons-configuration, which is five years old at this point.
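As a sketch rather than the exact POM: com.example:parent-lib below is a hypothetical placeholder for whichever parent-POM dependency was dragging in commons-configuration 1.9; the second block shows the roll-back option, pinned in dependencyManagement.

<!-- Hypothetical: parent-lib stands in for the artifact bringing in 1.9 -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>parent-lib</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>commons-configuration</groupId>
      <artifactId>commons-configuration</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Alternatively, force the version hadoop-common expects everywhere -->
<dependency>
  <groupId>commons-configuration</groupId>
  <artifactId>commons-configuration</artifactId>
  <version>1.6</version>
</dependency>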

Upvotes: 1
