user8261552

Reputation:

Hadoop installation error: "ERROR: Cannot execute hdfs-config.sh"

I am following this tutorial to install Hadoop on my computer. As far as I can tell, I have followed the instructions exactly up to source ~/.profile, but when I try to format HDFS by running hdfs namenode -format, I get the following error:

ERROR: Cannot execute /usr/local/Cellar/hadoop/3.0.0/libexec/hdfs-config.sh

I have searched extensively online but could not find a solution.
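For reference, the environment setup in my ~/.profile looks roughly like this (the exact paths come from the tutorial and my Homebrew install, so treat them as an example rather than a spec):

# Assumed ~/.profile entries for a Homebrew Hadoop install; paths may differ
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin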

Upvotes: 6

Views: 18923

Answers (5)

Augustin MISAGO

Reputation: 1

I added sudo in front of hadoop-3.3.6/bin/hdfs namenode -format and it worked.

Just like this:

$ sudo hadoop-3.3.6/bin/hdfs namenode -format

Upvotes: 0

Dieudonne NOAH MENGUE

Reputation: 11

One other reason for this error is missing permissions to run Hadoop on localhost. We usually configure SSH to avoid typing passwords and to avoid giving Hadoop root permissions; in other words, we configure Hadoop to run in non-root mode. You can either use sudo every time you run Hadoop, or define the SSH key correctly.

Example:

$ ssh-keygen -t rsa -P ""                           # generate an RSA key pair with an empty passphrase
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys   # authorize that key for logins to localhost
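You can then confirm that passwordless SSH into localhost works before retrying the format (this assumes sshd is running locally):

# Should log you in without prompting for a password
$ ssh localhost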

Upvotes: 1

njjnex
njjnex

Reputation: 1555

Same issue for Hadoop 3.1.1 and above installed via Brew: HADOOP_HOME was not set up properly. Execute:

$ echo $HADOOP_HOME

And if you see "/usr/local/Cellar/hadoop", you have to add your specific Hadoop version:

$ export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1
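To make this survive new shells, put the export in your profile. If you would rather not hard-code the version (assuming a Homebrew install, where brew --prefix hadoop resolves to the currently linked version), something like this works:

# Hypothetical profile entry; brew --prefix hadoop points at the linked version
export HADOOP_HOME="$(brew --prefix hadoop)"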

Upvotes: 2

Shanemeister

Reputation: 685

@BIKI I just ran into the same problem. Hadoop release 3.0.0 has an odd file structure that does not work with the home directory set the way you would think it should be.

I am on a Mac running High Sierra (macOS 10.13) and installed via brew, but I think you would see something similar on Ubuntu or any UNIX-like system.

Bottom line, if you want to track down the errors, check your HADOOP_HOME in your profile (.bash_profile) and the scripts that are started when you kick off Hadoop. In my case, I have an alias set in my profile called hstart and it calls the following files:

start-dfs.sh

AND

start-yarn.sh

These files in turn call hdfs-config.sh, which cannot be found given the home directory setting.
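You can see why from a simplified sketch of the Hadoop 3.x lookup logic (a paraphrase, not the verbatim script):

# Paraphrase of how the launcher scripts locate hdfs-config.sh
HADOOP_LIBEXEC_DIR="${HADOOP_LIBEXEC_DIR:-${HADOOP_HOME}/libexec}"
if [[ -f "${HADOOP_LIBEXEC_DIR}/hdfs-config.sh" ]]; then
  . "${HADOOP_LIBEXEC_DIR}/hdfs-config.sh"
else
  echo "ERROR: Cannot execute ${HADOOP_LIBEXEC_DIR}/hdfs-config.sh." >&2
  exit 1
fi

With a Homebrew install, the real Hadoop tree (including its own libexec/hdfs-config.sh) sits one level down, inside .../3.0.0/libexec, which is why the lookup above fails.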

My Hadoop home directory was set to:

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0

And I changed it to:

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0/libexec

Of course you have to source your configuration profile, and in my case it was:

source .bash_profile
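To check that the change took effect, the script should now be findable (paths from my install):

$ echo $HADOOP_HOME
/usr/local/Cellar/hadoop/3.0.0/libexec
$ ls $HADOOP_HOME/libexec/hdfs-config.sh
/usr/local/Cellar/hadoop/3.0.0/libexec/libexec/hdfs-config.sh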

For me, this did the trick. Hope that helps!

Upvotes: 14

Rajeev
Rajeev

Reputation: 5002

Looks like the latest version has issues with Brew. I directly downloaded Hadoop 2.8.1 from here instead.

Follow the same instructions; it works.
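If you go the manual route, the install is roughly this (the archive URL and install location are assumptions; pick your own mirror and prefix):

# Assumed archive URL and install location; adjust as needed
$ curl -O https://archive.apache.org/dist/hadoop/common/hadoop-2.8.1/hadoop-2.8.1.tar.gz
$ tar -xzf hadoop-2.8.1.tar.gz
$ export HADOOP_HOME=$PWD/hadoop-2.8.1
$ export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin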

Upvotes: 3
