subramanian

Reputation: 1025

HBase master failing to start with exception java.lang.NoSuchMethodException

I'm following Lars George's HBase: The Definitive Guide. I'm setting up a pseudo-distributed cluster on my Mountain Lion MacBook Pro. I downloaded the Hadoop 0.20.205.0 archive, untarred it, and made a few minor changes, such as

dfs.replication=1

and a few others. I then made changes in hbase-site.xml to point HBase at the localhost HDFS namenode, as in

hbase.rootdir=hdfs://localhost:9000/hbase

and a few other properties such as

hbase.zookeeper.quorum, hbase.zookeeper.property.dataDir, hbase.cluster.distributed=true

and so forth.
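For reference, putting those together in hbase-site.xml gives something like the following minimal pseudo-distributed config (property names are the standard HBase ones; the ZooKeeper data directory path is just a placeholder, not necessarily the exact value I used):

<configuration>
  <!-- HBase data lives in HDFS on the local namenode -->
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <!-- run master, regionserver and ZooKeeper as separate processes -->
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
  <!-- placeholder path for the ZooKeeper data directory -->
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/tmp/zookeeper</value>
  </property>
</configuration>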

However, on running

bin/start-hbase.sh

I was unable to access the web UI at port 60010. Running jps, I noticed the HBase master was dying shortly after startup, so I looked at the master log and found this exception:

2013-06-23 14:22:43,694 WARN org.apache.hadoop.hbase.util.FSUtils: Unable to create version file at hdfs://localhost:9000/hbase, retrying: java.io.IOException: java.lang.NoSuchMethodException: org.apache.hadoop.hdfs.protocol.ClientProtocol.create(java.lang.String, org.apache.hadoop.fs.permission.FsPermission, java.lang.String, boolean, boolean, short, long)
    at java.lang.Class.getMethod(Class.java:1607)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)

I don't want to repeat the usual "I'm a newbie, please help :)" that gets appended to questions like this, but I really am a newbie. I did what I could, couldn't find the answer, and any pointers will be greatly appreciated.

Upvotes: 0

Views: 1114

Answers (2)

Tariq

Reputation: 34184

When you use HBase in pseudo- (or fully) distributed mode, it has dependencies on the Hadoop libraries (for example, the RPC version may change because of a change in the wire protocol). That's why you need the proper Hadoop jars in your HBase lib folder.

Because of these dependencies, each version of HBase comes bundled with a copy of the Hadoop jars under its lib directory. The bundled Hadoop may have been built, at the time of your HBase release, from a branch different from the one you are running now. For example, hbase-0.94.7 is quite recent compared to hadoop-0.20.205. That is why it is critical that the version of Hadoop running on your cluster matches what your version of HBase was built against.
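A quick way to see the mismatch is to compare the Hadoop jar bundled with HBase against the Hadoop version actually serving HDFS; something along these lines (the install paths are just examples, adjust to your layout):

ls /path/to/hbase/lib/hadoop-core-*.jar      # the Hadoop jar HBase shipped with
/path/to/hadoop/bin/hadoop version           # the Hadoop version your cluster is running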

Also, I would suggest using the latest stable releases of both Hadoop and HBase to avoid these problems; 0.20.205 is quite ancient now.

Hope this answers your question.

Upvotes: 2

subramanian

Reputation: 1025

Although my problem is solved by copying the hadoop-core-0.20.205 jar into HBase's lib directory (roughly as sketched below), it would be great if someone experienced with HBase could comment on this one. I'd like to hear an experienced answer.
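For anyone hitting the same error, the swap amounts to something like this (the paths are assumptions for a plain tarball install; keep a backup of the jar you remove):

# remove the Hadoop core jar that ships with HBase
rm /path/to/hbase/lib/hadoop-core-*.jar

# copy in the core jar from the Hadoop install that actually runs HDFS
cp /path/to/hadoop-0.20.205.0/hadoop-core-0.20.205.0.jar /path/to/hbase/lib/

# restart HBase so the master picks up the matching RPC classes
bin/stop-hbase.sh
bin/start-hbase.sh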

Upvotes: 0
