Reputation: 342
I am installing Hadoop (3.2.2) on Ubuntu 18.04 in VMware Fusion for the first time. At the end of the installation, when I run 'hdfs namenode -format', it shows: ERROR: Invalid HADOOP_COMMON_HOME
This is what I have in .bashrc:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME
export HADOOP_HDFS_HOME=HADOOP_HOME
export HADOOP_MAPRED_HOME=HADOOP_HOME
export YARN_HOME=HADOOP_HOME
export HADOOP_COMMON_HOME=HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=HADOOP_HOME/lib/native
export PATH=PATH:JAVA_HOME/bin:HADOOP_HOME/bin:HADOOP_HOME/sbin
export HADOOP_OPTS="HADOOP_OPTS -Djava.library.path=HADOOP_HOME/lib/native"
Thanks in advance for your advice.
Upvotes: 1
Views: 5696
Reputation: 1
I had the same error ("ERROR: Invalid HADOOP_COMMON_HOME"), but for a different reason. I had set all the environment variables correctly, yet hadoop-functions.sh, which is executed when you use start-dfs.sh for example, did not detect HADOOP_COMMON_HOME no matter what I tried. The workaround I found was to hardcode these paths inside hadoop-functions.sh (it wasn't only HADOOP_COMMON_HOME that failed, but other Hadoop variables as well). I set all of these variables just before the first check that errored out (line 251):
# Workaround: hardcode the Hadoop home directories so the checks below pass
HADOOP_COMMON_HOME=/usr/local/hadoop
HADOOP_HDFS_HOME=/usr/local/hadoop
HADOOP_MAPRED_HOME=/usr/local/hadoop
HADOOP_YARN_HOME=/usr/local/hadoop

# The existing check that was failing
if [[ ! -d "${HADOOP_COMMON_HOME}" ]]; then
  echo "${HADOOP_COMMON_HOME}"
  hadoop_error "ERROR: Invalid HADOOP_COMMON_HOME"
  exit 1
fi
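With those lines in place, re-running the command from the question should get past the check:

# should no longer abort with "ERROR: Invalid HADOOP_COMMON_HOME"
hdfs namenode -format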
This fixed my installation, so I hope it is useful to someone else too.
Upvotes: 0
Reputation: 5541
Whenever you reference a variable, you need to prefix it with a $, i.e.:
export HADOOP_HDFS_HOME=$HADOOP_HOME
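Applied to every line of the .bashrc from the question, the corrected file would look something like this (a sketch that keeps the same install paths and folds the redundant first PATH export into the last one):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"

After editing, run source ~/.bashrc (or open a new shell) so the corrected values take effect, then retry hdfs namenode -format.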
Upvotes: 3