Mina F

Reputation: 67

Problems using start-dfs.sh

I used this guide to set up a 4-node cluster: https://blog.insightdatascience.com/spinning-up-a-free-hadoop-cluster-step-by-step-c406d56bae42, but when I reach the step that starts the Hadoop cluster, I get errors like these:

$HADOOP_HOME/sbin/start-dfs.sh

Starting namenodes on [namenode_dns]
namenode_dns: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
namenode_dns: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
namenode_dns: starting namenode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
namenode_dns: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out' for reading: No such file or directory
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
ip-172-31-1-82: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-1-82.out
ip-172-31-7-221: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-7-221.out
ip-172-31-14-230: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-14-230.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
0.0.0.0: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
0.0.0.0: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out' for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory

Here is what happens when I run jps:

20688 Jps

I'm not sure where I went wrong with the configuration. I am new to Hadoop and MapReduce, so please keep it simple.

Upvotes: 1

Views: 1337

Answers (1)

SachinJose

Reputation: 8522

It's a permission-related issue. It looks like the user you are using to start the Hadoop services (I think it's ubuntu) doesn't have write permission on the Hadoop home directory (/usr/local/hadoop), so it can't create the logs directory; you probably copied the Hadoop files as sudo/root. Either change the ownership of the Hadoop home directory recursively, or give write access to the /usr/local/hadoop/logs directory.
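Before changing anything, you can confirm who owns the directory (a quick check, using the path from your output):

ls -ld /usr/local/hadoop

If the owner shown is root instead of ubuntu, that confirms the diagnosis.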

sudo chown -R ubuntu:ubuntu /usr/local/hadoop

or

sudo mkdir -p /usr/local/hadoop/logs
sudo chmod 777 /usr/local/hadoop/logs

(mkdir -p first, since your errors show the logs directory was never created.)
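After either fix, restart HDFS and check the daemons again (a minimal sanity check, assuming the same $HADOOP_HOME as in your output):

$HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh
jps

On the namenode machine, jps should now list NameNode and SecondaryNameNode alongside Jps; on each worker it should list DataNode.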

Upvotes: 2
