Tejas

Reputation: 257

Running Hadoop in multi-node cluster not working

I had HDFS installed and working on 3 computers. Then I tried to add 5 more PCs to the existing cluster, but after doing so, when I tried to start Hadoop on the master node I got the error shown below.

[hduser@dellnode1 ~]$ start-all.sh
starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.out
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log (Permission denied)
    at java.io.FileOutputStream.openAppend(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:207)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
dellnode3.pictlibrary: datanode running as process 4856. Stop it first.
dellnode1.pictlibrary: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-datanode-dellnode1.pictlibrary.out
dellnode2.pictlibrary: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-datanode-dellnode2.pictlibrary.out
dellnode1.pictlibrary: starting secondarynamenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-secondarynamenode-dellnode1.pictlibrary.out
starting jobtracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-jobtracker-dellnode1.pictlibrary.out
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /usr/local/hadoop/bin/../logs/hadoop-hduser-jobtracker-dellnode1.pictlibrary.log (Permission denied)
    at java.io.FileOutputStream.openAppend(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:207)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
dellnode3.pictlibrary: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-tasktracker-dellnode3.pictlibrary.out
dellnode1.pictlibrary: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-tasktracker-dellnode1.pictlibrary.out
dellnode2.pictlibrary: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-tasktracker-dellnode2.pictlibrary.out

All PCs are running Fedora 17.

Upvotes: 0

Views: 1203

Answers (1)

Angelos Kapsimanis

Reputation: 999

I would create the log file manually with something like

sudo touch /usr/local/hadoop/bin/logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log

If your actual path is

/usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log

then adjust the path in the touch command above to match it.

Then change the file's permissions:

sudo chmod 750 /usr/local/hadoop/bin/logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log

and try again. It should work this time ;-)
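
Note that a file created with sudo touch will be owned by root, so the hduser daemons may still be unable to write to it even after chmod 750. A minimal sketch of a broader fix, assuming the log directory resolves to /usr/local/hadoop/logs and the daemons run as hduser (the hadoop group name here is an assumption; use whatever group hduser actually belongs to):

# /usr/local/hadoop/bin/../logs resolves to /usr/local/hadoop/logs
sudo mkdir -p /usr/local/hadoop/logs
# give the whole log directory to the user running the Hadoop daemons
sudo chown -R hduser:hadoop /usr/local/hadoop/logs
sudo chmod -R 750 /usr/local/hadoop/logs
# verify ownership and permissions before restarting
ls -ld /usr/local/hadoop/logs

If the error also appears on the newly added nodes, the same fix applies there.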

Upvotes: 1
