AkshayMane

Reputation: 45

JPS gives empty output, none of the Hadoop daemons start with start-all.sh, Hadoop pseudo-distributed mode, on a 32-bit VM running on a 64-bit Windows OS

I'm trying to set up Hadoop 2.7.1 with Java OpenJDK 7 on a 32-bit VM running on top of a 64-bit OS. I have configured all the files as described here: http://pingax.com/install-hadoop2-6-0-on-ubuntu/

Even after I run start-dfs.sh or start-all.sh, none of the daemons start.

Here's the output of the start-dfs.sh and jps commands:

hduser@ubuntu:~$ start-dfs.sh
16/04/22 00:33:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-ubuntu.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-ubuntu.out
16/04/22 00:33:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hduser@ubuntu:~$  jps
12147 Jps
hduser@ubuntu:~$ 

I can't work out the cause. As far as the warning is concerned, as pointed out in a few other answers, it can be ignored or suppressed.
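For reference, one common way to suppress that warning (a sketch, assuming the stock install path /usr/local/hadoop) is to lower the log level of the NativeCodeLoader class in Hadoop's log4j.properties:

hduser@ubuntu:~$ echo "log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR" >> /usr/local/hadoop/etc/hadoop/log4j.properties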

I also looked at the contents of the .out file mentioned in the output above, which reads like this:

hduser@ubuntu:~$ cat /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.out
OpenJDK Client VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 14869
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
hduser@ubuntu:~$ 

The same error appears in the DataNode's log file. Any help will be appreciated...
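The .out file's own warning points to execstack as a fix; on Ubuntu, something along these lines should clear the executable-stack flag (a sketch, using the library path from the warning above; the execstack tool may need to be installed first):

hduser@ubuntu:~$ sudo apt-get install execstack
hduser@ubuntu:~$ sudo execstack -c /usr/local/hadoop/lib/native/libhadoop.so.1.0.0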


EDIT:

https://chawlasumit.wordpress.com/2014/06/17/hadoop-java-hotspottm-execstack-warning/

As suggested there, I made the changes, but this only suppressed the error in the log:

hduser@ubuntu:~$ cat /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.out
ulimit -a for user hduser
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 14869
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 14869
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

Upvotes: 1

Views: 1009

Answers (2)

AkshayMane

Reputation: 45

This problem occurs because of a combination of factors:

1. You are using a virtual machine.
2. The VM is 32-bit, running on top of a 64-bit host.
3. The default native library bundled with Hadoop is not built for a 32-bit VM.
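You can confirm the mismatch by comparing your VM's architecture with that of the bundled native library (a sketch; the path assumes the default /usr/local/hadoop install from the tutorial):

hduser@ubuntu:~$ uname -m    # i686 or i386 means a 32-bit VM
hduser@ubuntu:~$ file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0    # reports whether the library is a 32-bit or 64-bit ELF object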

Here's one plausible solution, though it didn't work for me: hadoop 2.2.0 64-bit installing but cannot start

However, when I configured Hadoop with the same steps (http://pingax.com/install-hadoop2-6-0-on-ubuntu/) directly on a physical Ubuntu machine, without a virtual machine, it worked correctly :)

So if you are facing this, try running Hadoop on a physical Ubuntu machine.

Upvotes: 0

mkg90

Reputation: 33

Check your configuration files. Make sure that the content of your .xml files (especially core-site.xml) is up to date. A few websites have outdated tutorials that use "fs.default.name" instead of "fs.defaultFS" in the core-site.xml file.
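For example, a minimal core-site.xml with the current property name looks like this (a sketch; hdfs://localhost:9000 is the usual single-node value in such tutorials, so adjust it to your setup):

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>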

Upvotes: 1
