Micah

Reputation: 116090

Having issue setting up Hadoop

The issue I'm having is that when I run bin/hadoop fs -ls, it prints all the files in the local directory I'm in rather than the files in HDFS (of which there should currently be none). Here's how I set everything up:

I've downloaded and unzipped all the 0.20.2 files into /home/micah/hadoop-install/. I've edited my conf/hdfs-site.xml with the following settings and created the appropriate directories:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/micah/hdfs/data</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/micah/hdfs/name</value>
  </property>
</configuration>

I then ran bin/hadoop namenode -format followed by bin/start-dfs.sh.
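
In short, the sequence was roughly this (the hadoop-0.20.2 directory name is assumed from the version; everything else is as described above):

    cd /home/micah/hadoop-install/hadoop-0.20.2   # assumed unpack directory for the 0.20.2 tarball
    bin/hadoop namenode -format                   # format a fresh namenode
    bin/start-dfs.sh                              # start the namenode and datanode daemons
    bin/hadoop fs -ls                             # symptom: prints the local working directory, not HDFS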

Upvotes: 2

Views: 5668

Answers (3)

rajeev

Reputation: 11

Thanks! Doing the following resolved my issue:

    rm -r /tmp/hadoop*                                  # clear Hadoop's default temp dirs (/tmp/hadoop-<user>)
    build $HADOOP_HOME                                  # 'build' is a helper function, not a standard command
    echo "export JAVA_HOME=$JAVA_HOME" >> $HADOOP_HOME/conf/hadoop-env.sh
    echoThenRun "$HADOOP_HOME/bin/stop-all.sh"          # 'echoThenRun' prints, then runs, its argument (also a helper)
    echoThenRun "$HADOOP_HOME/bin/hadoop namenode -format"

Upvotes: 1

Josh Hansen

Reputation: 1448

I had a similar issue and found that my HDFS data directory permissions were wrong.

Removing group write permissions from the data directory with chmod -R g-w fixed the problem.
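
A minimal sketch, assuming dfs.data.dir is /home/micah/hdfs/data as in the question:

    # The 0.20.x datanode rejects data directories that are group- or world-writable,
    # so strip group write permission recursively:
    chmod -R g-w /home/micah/hdfs/data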

Upvotes: 1

simpatico

Reputation: 11087

Try this:

    # http://www.mail-archive.com/[email protected]/msg00407.html
    rm -r /tmp/hadoop*                                  # clear Hadoop's default temp dirs (/tmp/hadoop-<user>)
    build $HADOOP_HOME                                  # 'build' is a shell helper from the linked post, not a standard command
    echo "export JAVA_HOME=$JAVA_HOME" >> $HADOOP_HOME/conf/hadoop-env.sh   # persist JAVA_HOME for the daemons
    echoThenRun "$HADOOP_HOME/bin/stop-all.sh"          # 'echoThenRun' prints, then runs, its argument (also a helper)
    echoThenRun "$HADOOP_HOME/bin/hadoop namenode -format"

Upvotes: 4
