ayushman999

Reputation: 511

JAVA_HOME is not set in Hadoop

I am a beginner with Hadoop, trying to install and run it on Ubuntu as a single-node cluster. This is the JAVA_HOME in my hadoop_env.sh:

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386/
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}

But when I run it, the following errors appear:

Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.

How do I remove this error?

Upvotes: 10

Views: 33159

Answers (8)

Haha TTpro

Reputation: 5556

First, you must set JAVA_HOME in your hadoop-env.sh (your local JAVA_HOME in .bashrc is likely to be ignored).

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/default-java

Then, set HADOOP_CONF_DIR to point to the directory containing your hadoop-env.sh. In ~/.bashrc, add the following lines:

HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
export HADOOP_CONF_DIR

where /usr/local/hadoop/etc/hadoop is the directory that contains hadoop-env.sh.
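
To verify that both settings are picked up, a quick check like this may help (paths are the ones from this answer; adjust to your installation):

source ~/.bashrc
echo $HADOOP_CONF_DIR                             # should print /usr/local/hadoop/etc/hadoop
grep JAVA_HOME "$HADOOP_CONF_DIR/hadoop-env.sh"   # should show the export line above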

Upvotes: 2

mule.ear

Reputation: 66

I'm using Hadoop 2.8.0. Even though I exported JAVA_HOME (I put it in .bashrc), I still got this error when trying to run start-dfs.sh.

user@host:/opt/hadoop-2.8.0 $ echo $JAVA_HOME
<path_to_java>
user@host:/opt/hadoop-2.8.0 $ $JAVA_HOME/bin/java -version
java version "1.8.0_65"
...
user@host:/opt/hadoop-2.8.0 $ sbin/start-dfs.sh
...
Starting namenodes on []
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.

The only way I could get it to run was to add JAVA_HOME=path_to_java to etc/hadoop/hadoop-env.sh and then source it:

:/opt/hadoop-2.8.0 $ grep JAVA_HOME etc/hadoop/hadoop-env.sh
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=path_to_java
user@host:/opt/hadoop-2.8.0 $ source etc/hadoop/hadoop-env.sh

Maybe that (sourcing hadoop-env.sh) was implied in the posts above. Just thought someone should say it out loud. Now it runs. I've encountered other issues (due, I suspect, to the limited resources on the server I'm using), but at least I got past this one.
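
For what it's worth, the edit can also be scripted; a minimal sketch, assuming GNU sed, a Hadoop tree at /opt/hadoop-2.8.0 as above, and with path_to_java still standing in for your real JDK path:

cd /opt/hadoop-2.8.0
# replace the (possibly commented-out) JAVA_HOME line in hadoop-env.sh
sed -i 's|^#\{0,1\}export JAVA_HOME=.*|export JAVA_HOME=path_to_java|' etc/hadoop/hadoop-env.sh
source etc/hadoop/hadoop-env.sh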

Upvotes: 1

Jo Kachikaran

Reputation: 582

The above answers should work as long as you are using the default conf directory ($HADOOP_HOME/conf or $HADOOP_HOME/etc/hadoop). If you're using a different conf folder, here are a few things you should do.

  1. Copy the hadoop-env.sh file from the default conf directory to your conf folder, say /home/abc/hadoopConf.
  2. Replace the line

    #export JAVA_HOME=${JAVA_HOME}
    

    with the following:

    export JAVA_HOME=/usr/lib/jvm/java-8-oracle
    export HADOOP_CONF_DIR=/home/abc/hadoopConf
    

Change the values appropriately. If you have any other Hadoop-related environment variables configured in your .bashrc, .profile, or .bash_profile, consider adding them next to the above lines.
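
To sanity-check that a non-default conf folder is actually being read, the hadoop command accepts a --config flag; with the example directory from step 1, something like this should run without the JAVA_HOME error:

hadoop --config /home/abc/hadoopConf version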

Upvotes: 0

hoang

Reputation: 1

Hadoop cannot handle the space between Program and Files in "Program Files". So I copied the JDK folder to C:\ (or any folder with no space in its name) and set export JAVA_HOME=Name_Path_Copied. After that it ran fine for me.
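
If you would rather not copy the JDK, the 8.3 short name of Program Files avoids the space as well; a sketch for hadoop-env.sh under a Cygwin-style shell (the JDK version here is a guess, use your actual folder name):

export JAVA_HOME="C:/Progra~1/Java/jdk1.8.0_65"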

Upvotes: -1

toobee

Reputation: 2752

I had the same error and solved it with Sohil Jain's remark, but to make it a bit clearer: hadoop-env.sh uses an expression such as

export JAVA_HOME=${JAVA_HOME}

If you instead hard-code the path to your JVM installation, it works:

export JAVA_HOME=/usr/lib/jvm/java...

The resolution via the environment variable, as shipped, seems to fail; hard-coding fixed the problem for me.
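
If you are unsure which path to hard-code, the JVM home can usually be recovered from the java binary on your PATH (a sketch for Linux; readlink -f is GNU coreutils):

readlink -f "$(which java)" | sed 's:/bin/java$::'
# prints the JVM directory, e.g. /usr/lib/jvm/java-7-openjdk-i386/jre,
# which you can then paste into hadoop-env.sh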

Upvotes: 15

Sohil Jain

Reputation: 331

I debugged the code and found that even though JAVA_HOME is set in the environment, the value is lost when SSH connections to other hosts are made inside the scripts: the JAVA_HOME variable that appeared to be set correctly in start-dfs.sh ends up unset in hadoop-env.sh.

The solution is to set the JAVA_HOME variable in hadoop-env.sh itself; then everything works properly.
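
You can observe the underlying behavior directly: a non-interactive SSH shell does not inherit your session's exports, so even with JAVA_HOME set locally the remote command will likely see it empty (this assumes the passwordless ssh to localhost that a single-node setup requires):

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386
echo "[$JAVA_HOME]"                   # prints the path
ssh localhost 'echo "[$JAVA_HOME]"'   # likely prints [] - the export did not survive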

Upvotes: 19

griffon vulture

Reputation: 6774

Are you loading hadoop_env.sh? You may be referring to hadoop-env.sh (dash instead of underscore; it is under the conf directory).

BTW, this is a very useful guide for a quick installation:

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
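
A quick way to check which of the two files actually exists, assuming Hadoop is installed under /usr/local/hadoop as in that guide:

find /usr/local/hadoop -name 'hadoop*env.sh'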

Upvotes: 2

Ankur Shanbhag

Reputation: 7804

Under your HADOOP_HOME/conf directory, please update the hadoop-env.sh file. It has an entry to export JAVA_HOME.

Setting an appropriate JAVA_HOME in this file should solve your issue.
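
Concretely, the entry would look something like this (using the OpenJDK path from the question; substitute your own JDK):

# in $HADOOP_HOME/conf/hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386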

Upvotes: 6
