Reputation: 31
After I successfully created the name node, I ran into this problem when trying to start the NameNode. To me it looks as if the script is trying to log to a file that does not exist. How can I change my setup to direct the script's log output to the correct directory?
bash-3.2$ start-all.sh
starting namenode, logging to /usr/local/bin/../logs/hadoop-Yili-namenode-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting datanode, logging to /usr/local/bin/../logs/hadoop-Yili-datanode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/bin/../logs/hadoop-Yili-secondarynamenode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
starting jobtracker, logging to /usr/local/bin/../logs/hadoop-Yili-jobtracker-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting tasktracker, logging to /usr/local/bin/../logs/hadoop-Yili-tasktracker-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
Upvotes: 3
Views: 9312
Reputation: 474
Try running which hadoop.
If this command gives you output, then HADOOP_HOME has already been set up in your .bashrc file.
If it is not set, edit the .bashrc file in your home directory and add the statements below, assuming Hadoop is installed in /opt/hadoop. It may be in another location; adjust the path accordingly.
HADOOP_HOME=/opt/hadoop
export HADOOP_HOME
PATH=$PATH:$HADOOP_HOME/bin
export PATH
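After saving .bashrc, a quick way to verify the change (a sketch; /opt/hadoop is the assumed install location from above, so substitute your actual path) is to reload the file, confirm the hadoop binary is now found, and then re-run the start script:
source ~/.bashrc
which hadoop    # should now print a path such as /opt/hadoop/bin/hadoop
start-all.sh    # the "No such file or directory" errors should no longer appear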
Hope this helps.
Upvotes: 2