SkyWalker

Reputation: 14309

Hadoop 2.2.0 fails running start-dfs.sh with Error: JAVA_HOME is not set and could not be found

I have a work-in-progress installation of Hadoop on Ubuntu 12.x. I already have a deploy user, which I plan to use to run Hadoop on a cluster of machines. The following session demonstrates my problem: I can ssh to olympus with no issues, but start-dfs.sh fails while doing exactly that:

deploy@olympus:~$ ssh olympus
Welcome to Ubuntu 12.04.4 LTS (GNU/Linux 3.5.0-45-generic x86_64)

    * Documentation:  https://help.ubuntu.com/

Last login: Mon Feb  3 18:22:27 2014 from olympus
deploy@olympus:~$ echo $JAVA_HOME
/opt/dev/java/1.7.0_51

deploy@olympus:~$ start-dfs.sh
Starting namenodes on [olympus]
olympus: Error: JAVA_HOME is not set and could not be found.

Upvotes: 19

Views: 14066

Answers (5)

Vikas Hardia

Reputation: 2695

You can edit the hadoop-env.sh file and set JAVA_HOME for Hadoop.

Open the file and find the line below:

export JAVA_HOME=/usr/lib/j2sdk1.6-sun

Uncomment the line and update JAVA_HOME to match your environment.

This will solve the problem with JAVA_HOME.
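For example, the edit can be scripted with sed (the config directory below is an assumption for a typical tarball install; adjust it to wherever your hadoop-env.sh actually lives, and use your own JDK path):

```shell
# Assumed location of the Hadoop config directory -- adjust as needed.
HADOOP_CONF=/usr/local/hadoop/etc/hadoop

# Replace any existing (possibly commented-out) JAVA_HOME line with an
# explicit path; here the asker's JDK path from the question is used.
sed -i 's|^#\?export JAVA_HOME=.*|export JAVA_HOME=/opt/dev/java/1.7.0_51|' \
    "$HADOOP_CONF/hadoop-env.sh"

# Verify the result.
grep '^export JAVA_HOME=' "$HADOOP_CONF/hadoop-env.sh"
```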

Upvotes: 31

benjaminedwardwebb

Reputation: 133

Alternatively, you can edit /etc/environment to include:

JAVA_HOME=/usr/lib/jvm/[YOURJAVADIRECTORY]

This makes JAVA_HOME available to all users on the system, and allows start-dfs.sh to see the value. My guess is that start-dfs.sh is kicking off a process as another user somewhere that does not pick up the variable unless explicitly set in hadoop-env.sh.

Using hadoop-env.sh is arguably clearer -- just adding this option for completeness.
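For reference, /etc/environment is a plain KEY=value file read by PAM at login: no `export` keyword and no shell expansion. The JDK path below is just an example, not the asker's actual directory:

```shell
# /etc/environment -- system-wide, applied at login
JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
```

A fresh login (or reboot) is needed before the new value becomes visible to sessions.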

Upvotes: 3

ozw1z5rd

Reputation: 3208

I have Hadoop installed in /opt/hadoop/ and Java installed in /usr/lib/jvm/java-8-oracle. Adding the following to my bash profile solved all my problems:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_HOME=/opt/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_ROOT_LOGGER=INFO,console
export HADOOP_SECURITY_LOGGER=INFO,NullAppender
export HDFS_AUDIT_LOGGER=INFO,NullAppender
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
export YARN_LOG_DIR=/tmp
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

Upvotes: 0

nikk

Reputation: 2877

Edit the Hadoop start-up script /etc/hadoop/hadoop-env.sh and set JAVA_HOME explicitly.

For example: Instead of export JAVA_HOME=${JAVA_HOME}, do

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.65-3.b17.el7.x86_64/jre

This example uses the java-1.8.0-openjdk package.

Upvotes: 0

Ganesh Krishnan

Reputation: 7395

A weird out-of-the-box bug on Ubuntu. The current line

export JAVA_HOME=${JAVA_HOME}

in /etc/hadoop/hadoop-env.sh should pick up JAVA_HOME from the host environment, but it doesn't.

Just edit the file and hard-code JAVA_HOME for now.
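A likely explanation (my guess, consistent with the /etc/environment answer above): start-dfs.sh launches the daemons over ssh, and an ssh-spawned non-interactive shell does not source your login profile, so ${JAVA_HOME} expands to nothing there. You can see the effect locally:

```shell
# Your login shell may have JAVA_HOME set, but a clean non-interactive
# shell starts without it -- just like the shell that ssh spawns to
# run hadoop-daemon.sh on each node.
env -i bash -c 'echo "JAVA_HOME=[$JAVA_HOME]"'   # prints JAVA_HOME=[]
```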

Upvotes: 12
