mcduck

Reputation: 33

HBase 0.98.5 fails to start on windows

I've downloaded and installed HBase 0.98.5 on a Windows 7 PC. I followed Apache's Getting Started guide and modified hbase-site.xml as shown below (the configuration element was empty out of the box):

<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///c:/datastore/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>c:/datastore/zookeper</value>
  </property>
</configuration>

When starting HBase I get the following error:

2014-08-13 14:37:26,827 DEBUG [main-EventThread] master.ActiveMasterManager: A master is now available
2014-08-13 14:37:26,828 WARN  [M:0;rzm01:57477] hbase.ZNodeClearer: Environment variable HBASE_ZNODE_FILE not set; znodes will not be cleared on crash by start scripts (Longer MTTR!)
2014-08-13 14:37:26,829 INFO  [M:0;rzm01:57477] master.ActiveMasterManager: Registered Active Master=rzm01.self,57477,1407929845469
2014-08-13 14:37:26,836 INFO  [M:0;rzm01:57477] Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
2014-08-13 14:37:26,909 FATAL [M:0;rzm01:57477] master.HMaster: Unhandled exception. Starting shutdown.
java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
        at org.apache.hadoop.util.Shell.run(Shell.java:379)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:783)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:772)
        at org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:651)
        at org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:629)
        at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:587)
        at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:461)
        at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:152)
        at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
        at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:790)
        at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:603)
        at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.run(HMasterCommandLine.java:263)
        at java.lang.Thread.run(Thread.java:744)
2014-08-13 14:37:26,912 INFO  [M:0;rzm01:57477] master.HMaster: Aborting
2014-08-13 14:37:26,912 DEBUG [M:0;rzm01:57477] master.HMaster: Stopping service threads
2014-08-13 14:37:26,913 INFO  [M:0;rzm01:57477] ipc.RpcServer: Stopping server on 57477
2014-08-13 14:37:26,914 INFO  [M:0;rzm01:57477] master.HMaster: Stopping infoServer
2014-08-13 14:37:26,918 INFO  [RpcServer.listener,port=57477] ipc.RpcServer: RpcServer.listener,port=57477: stopping
2014-08-13 14:37:26,952 INFO  [main] regionserver.ShutdownHook: Installed shutdown hook thread: Shutdownhook:RS:0;rzm01:57505

Looking at the ProcessBuilder#start() method where the exception is thrown, it seems that one of the command arguments is null:

public Process start() throws IOException {
        // Must convert to array first -- a malicious user-supplied
        // list might try to circumvent the security check.
        String[] cmdarray = command.toArray(new String[command.size()]);
        cmdarray = cmdarray.clone();

        for (String arg : cmdarray)
            if (arg == null)
                throw new NullPointerException(); // this is line 1010
        // Throws IndexOutOfBoundsException if command is empty
        String prog = cmdarray[0];
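To confirm my reading of that check, here is a minimal standalone reproduction (my own test, not HBase code): passing a null element in the command list makes start() throw exactly this NullPointerException, so whatever Hadoop's Shell put into the command must have been null.

```java
import java.io.IOException;

// Minimal reproduction: ProcessBuilder.start() rejects any null
// element in its command list with a bare NullPointerException,
// matching the stack trace above (ProcessBuilder.java:1010).
public class NullArgDemo {
    public static void main(String[] args) throws IOException {
        ProcessBuilder pb = new ProcessBuilder("echo", null, "hello");
        try {
            pb.start();
            System.out.println("started");
        } catch (NullPointerException e) {
            System.out.println("NullPointerException: null command argument");
        }
    }
}
```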

Any ideas what could be wrong here?

Upvotes: 3

Views: 1182

Answers (1)

Veaceslav Dubenco

Reputation: 86

The problem is that the HBase distribution for Windows is missing some native Hadoop files. You can download them from the following blog post: http://www.srccodes.com/p/article/39/error-util-shell-failed-locate-winutils-binary-hadoop-binary-path (thanks to Abhijit Ghosh for building these files for Windows and sharing them; otherwise you would have to build them yourself).

So follow these steps to solve the problem:

  1. Download or build the Hadoop binaries and libraries for Windows.
  2. Copy those files (e.g. winutils.exe, winutils.dll, etc.) to %HBASE_HOME%\bin.
  3. Edit the %HBASE_HOME%\conf\hbase-env.cmd: Find the line:

    set HBASE_OPTS="-XX:+UseConcMarkSweepGC" "-Djava.net.preferIPv4Stack=true"
    

    and append "-Dhadoop.home.dir=c:\hbase-0.98.5-hadoop2" (where c:\hbase-0.98.5-hadoop2 is your HBase home directory), so that the line becomes:

    set HBASE_OPTS="-XX:+UseConcMarkSweepGC" "-Djava.net.preferIPv4Stack=true" "-Dhadoop.home.dir=c:\hbase-0.98.5-hadoop2"
    

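For reference, the reason step 3 works: on Windows, Hadoop's Shell class resolves the winutils.exe path from the hadoop.home.dir system property, falling back to the HADOOP_HOME environment variable. The sketch below is an approximation of that Hadoop 2.x lookup (not the exact Hadoop code) showing why the path, and hence a ProcessBuilder argument, ends up null when neither is set:

```java
import java.io.File;

// Sketch of the lookup Hadoop's Shell performs on Windows
// (an approximation of org.apache.hadoop.util.Shell in Hadoop 2.x,
// not the exact code). If neither the system property nor the
// environment variable is set, the resolved winutils.exe path stays
// null, which later surfaces as the NPE in the question.
public class WinutilsCheck {
    static String resolveWinutils() {
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) {
            home = System.getenv("HADOOP_HOME");
        }
        if (home == null) {
            return null; // -> null argument handed to ProcessBuilder
        }
        return new File(new File(home, "bin"), "winutils.exe").getPath();
    }

    public static void main(String[] args) {
        // Simulates the -Dhadoop.home.dir option added in step 3.
        System.setProperty("hadoop.home.dir", "c:\\hbase-0.98.5-hadoop2");
        System.out.println(resolveWinutils());
    }
}
```

Setting hadoop.home.dir via HBASE_OPTS just makes this lookup succeed inside the HBase JVM; setting a system-wide HADOOP_HOME environment variable pointing at the same directory would work too.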
Good luck

Upvotes: 6
