giaosudau

Reputation: 2251

Hadoop error: mkdir: cannot create directory `/var/run/hadoop'

I've set up Hadoop on my laptop in single-node mode. Info: Ubuntu 12.10, Oracle JDK 1.7, Hadoop installed from a .deb file.

Location:

/etc/hadoop
/usr/share/hadoop

In /usr/share/hadoop/templates/conf/core-site.xml I added two properties:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
  <description>A base for other temporary directories.</description>
</property>

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
  <description>The name of the default file system.  A URI whose
  scheme and authority determine the FileSystem implementation.  The
  uri's scheme determines the config property (fs.SCHEME.impl) naming
  the FileSystem implementation class.  The uri's authority is used to
  determine the host, port, etc. for a filesystem.</description>
</property>

In hdfs-site.xml:

<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
</property>

In mapred-site.xml:

<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
  <description>The host and port that the MapReduce job tracker runs
  at.  If "local", then jobs are run in-process as a single map
  and reduce task.
  </description>
</property>

When I start it with the command hduser@sepdau:~$ start-all.sh, I get:

starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-sepdau.com.out
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-sepdau.com.out
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-sepdau.com.out
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-sepdau.com.out
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-sepdau.com.out

But when I check the processes with jps:

hduser@sepdau:~$ jps
13725 Jps

More output, from netstat:

root@sepdau:/home/sepdau# netstat -plten | grep java
tcp6       0      0 :::8080                 :::*                    LISTEN      117        9953        1316/java       
tcp6       0      0 :::53976                :::*                    LISTEN      117        16755       1316/java       
tcp6       0      0 127.0.0.1:8700          :::*                    LISTEN      1000       786271      8323/java       
tcp6       0      0 :::59012                :::*                    LISTEN      117        16756       1316/java  

When I run stop-all.sh:

hduser@sepdau:~$ stop-all.sh
no jobtracker to stop
localhost: no tasktracker to stop
no namenode to stop
localhost: no datanode to stop
localhost: no secondarynamenode to stop

In my hosts file:

hduser@sepdau:~$ cat /etc/hosts

127.0.0.1       localhost
127.0.1.1   sepdau.com



# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

The slaves file contains localhost, and the masters file contains localhost.

Here is some log output:

hduser@sepdau:/home/sepdau$ start-all.sh
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-sepdau.com.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-namenode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-datanode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-secondarynamenode.pid: No such file or directory
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-sepdau.com.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-jobtracker.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-tasktracker.pid: No such file or directory

I tried running it as the root user, but I get the same problem.

What am I doing wrong here? Also, how do I connect Eclipse to Hadoop with the Hadoop plugin?

Upvotes: 0

Views: 15934

Answers (4)

Venu A Positive

Reputation: 3062

Restart the terminal and format the NameNode first.

In some rare cases someone may have changed the start-all.sh file in Hadoop's bin folder; check it once.

Also check whether your .bashrc configuration is correct.
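
A minimal sketch of that sequence, assuming a Hadoop 1.x install with the scripts on the PATH as in the question:

# stop anything that is half-running, then re-format HDFS
# (formatting erases any existing HDFS data)
stop-all.sh
hadoop namenode -format
start-all.sh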

Upvotes: 0

Minh Ha Pham

Reputation: 2596

Modify your hdfs-site.xml

<property>
  <name>dfs.name.dir</name>
  <value>/home/user_to_run_hadoop/hdfs/name</value>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/home/user_to_run_hadoop/hdfs/data</value>
</property>

Make sure to create the hdfs directory at /home/user_to_run_hadoop, then create the two directories name and data inside hdfs (see the sketch below).

After that you need to run chmod -R 755 ./hdfs/ and then path_to_hadoop_home/bin/hadoop namenode -format.
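
A minimal sketch of those steps; user_to_run_hadoop and path_to_hadoop_home are placeholders, substitute your own user and install path:

# create the name and data directories for HDFS
mkdir -p /home/user_to_run_hadoop/hdfs/name /home/user_to_run_hadoop/hdfs/data

# make them accessible, then format the namenode
chmod -R 755 /home/user_to_run_hadoop/hdfs
path_to_hadoop_home/bin/hadoop namenode -format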

Upvotes: 1

Shad Amez

Reputation: 428

You can change the paths where the pid files and logs are created by editing hadoop-env.sh. This file is stored in the conf folder.

export HADOOP_LOG_DIR=/home/username/hadoop-1x/logs

export HADOOP_PID_DIR=/home/username/pids
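
Both directories must exist and be writable by the user that starts the daemons. A minimal sketch, assuming hduser starts the daemons and username is the placeholder from the exports above:

# create the log and pid directories referenced in hadoop-env.sh
# (run as root or via sudo)
mkdir -p /home/username/hadoop-1x/logs /home/username/pids
chown -R hduser /home/username/hadoop-1x/logs /home/username/pids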

Upvotes: 0

abhinav

Reputation: 1282

Try adding

<property>
  <name>dfs.name.dir</name>
  <value>/home/abhinav/hdfs</value>
</property>

to hdfs-site.xml, and make sure that the directory exists (see the sketch below).
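
A minimal sketch of creating that directory, assuming hduser runs the daemons; adjust the path to your own home directory:

# create the dfs.name.dir location and hand it to the Hadoop user
# (run as root or via sudo)
mkdir -p /home/abhinav/hdfs
chown -R hduser /home/abhinav/hdfs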

I have written a small tutorial for this; see if it helps: http://blog.abhinavmathur.net/2013/01/experience-with-setting-multinode.html

Upvotes: 2
