ylun

Reputation: 2534

Hadoop: start-dfs/start-yarn.sh: No such file or directory

Running the scripts gives me the two errors above, even though I have checked that both files exist in the directory and that Hadoop has access to the folders.

I installed hadoop using the following tutorial: link

What's going wrong, and how can this be fixed?

Upvotes: 0

Views: 2373

Answers (2)

ruchita

Reputation: 163

I was also getting the same error; my mistake was that I had copied the wrong path into my ~/.profile file:

    alias hstart="/usr/local/Cellar/hadoop/2.6.0/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/start-yarn.sh"
    alias hstop="/usr/local/Cellar/hadoop/2.6.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/stop-dfs.sh"

In my case the installed version was 3.0.0, so the correct paths are /usr/local/Cellar/hadoop/3.0.0/sbin/start-dfs.sh and /usr/local/Cellar/hadoop/3.0.0/sbin/start-yarn.sh.
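For example, with a Homebrew install of Hadoop 3.0.0 (adjust the version number to whatever ls /usr/local/Cellar/hadoop shows on your machine), the corrected aliases in ~/.profile would be:

    alias hstart="/usr/local/Cellar/hadoop/3.0.0/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/3.0.0/sbin/start-yarn.sh"
    alias hstop="/usr/local/Cellar/hadoop/3.0.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/3.0.0/sbin/stop-dfs.sh"

After editing, run source ~/.profile (or open a new terminal) so the updated aliases take effect.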

Upvotes: 1

alekya reddy

Reputation: 934

You might not have exported the path of this directory. Try running the script with its full path, i.e. /entirepath/start-dfs.sh.

Also, in your ~/.bashrc file add HADOOP_HOME=/Pathtohadoopinstallationfolder, then run source ~/.bashrc to reload the file.
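As a minimal sketch, assuming the same Homebrew layout used in the other answer (/usr/local/Cellar/hadoop/3.0.0 — substitute your own install path and version), the relevant lines in ~/.bashrc might look like:

    # Point HADOOP_HOME at the Hadoop install directory (adjust to your setup)
    export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0
    # Put the sbin and bin directories on the PATH
    export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

After running source ~/.bashrc, start-dfs.sh and start-yarn.sh can be invoked from any directory without spelling out the full path.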

Upvotes: 2
