Santhosh

Reputation: 151

Could not find or load main class org.apache.hadoop.hdfs.server.namenode.Namenode

I know this has been asked before, but I could not figure out the solution. I am getting the error below when I try to run hdfs name node -format:

Could not find or load main class org.apache.hadoop.hdfs.server.namenode.Namenode

I followed the instructions from this website to install on my CentOS machine. The only difference is that I installed as root instead of as hadoopuser, as mentioned in the link.

.bashrc

# User specific aliases and functions

export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
export HADOOP_INSTALL=/usr/local/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export PATH=$PATH:$HADOOP_INSTALL/sbin
export PATH=$PATH:$HADOOP_INSTALL/bin

hadoop-env.sh

export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/

mapred-site.xml

<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>

yarn-site.xml

<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>

core-site.xml

<property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
</property>

hdfs-site.xml

<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>

<property>
    <name>dfs.name.dir</name>
    <value>file:///home/hadoopspace/hdfs/namenode</value>
</property>

<property>
    <name>dfs.data.dir</name>
    <value>file:///home/hadoopspace/hdfs/datanode</value>
</property>

Upvotes: 15

Views: 36794

Answers (14)

Ha Ngo

Reputation: 23

I solved this problem by modifying the path in .bashrc:

export PATH=$HADOOP_HOME/bin:$PATH

Upvotes: 0

Tunde Pizzle

Reputation: 827

Could be a classpath issue.

Add the following to your ~/.bashrc:

 export HADOOP_CLASSPATH=$(cygpath -pw $(hadoop classpath)):$HADOOP_CLASSPATH

Upvotes: 0

abelito

Reputation: 1104

For Hadoop v3.1.2 on Windows 7, I had to

  1. Install Cygwin (per the instructions).

  2. Set the following environment variables, noting that they use FORWARD slashes (/) instead of BACK slashes (\):

    HADOOP_HOME=D:/.../hadoop-3.1.2
    
    JAVA_HOME=D:/.../java-1.8.0-openjdk-1.8.0.201-2.b09.redhat.windows.x86_64
    
  3. Re-open Cygwin and cd into the Hadoop directory -- it must be re-opened to pick up the new environment variables. You can also use ~/.bashrc with export HADOOP_HOME=... etc. to do this (see the sketch after this list).

  4. Make sure you type the following exactly:

    ./bin/hdfs.cmd namenode -format

    It must be .cmd or else it won't work on Windows.

After that it worked perfectly. If you're still having trouble, dig into the hdfs.cmd file and add some echo calls to print out what it's running, especially near the java call, to see exactly what it is executing.
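For reference, a rough sketch of the ~/.bashrc route from step 3 (the drive paths below are placeholders, not the actual install locations, which were elided above):

    # hypothetical ~/.bashrc entries for Cygwin; substitute your own directories,
    # keeping the forward slashes noted in step 2
    export HADOOP_HOME="D:/path/to/hadoop-3.1.2"
    export JAVA_HOME="D:/path/to/java-1.8.0-openjdk"
    export PATH="$HADOOP_HOME/bin:$PATH"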

Upvotes: 0

Diya Krishna

Reputation: 1

The error is due to missing Hadoop HDFS JAR files in the Hadoop classpath. Type 'hadoop classpath' in the terminal and check whether the HDFS JAR files are present. If not, paste the line below into your .bashrc, save it, and source it.

export HADOOP_CLASSPATH=new-classpath:$HADOOP_CLASSPATH

You can create the new classpath by appending the location of your HDFS JAR files to the end of the existing classpath; replace the 'new-classpath' part with your own.
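As an example, assuming the layout from the question (HADOOP_INSTALL=/usr/local/hadoop), the HDFS JARs usually sit under share/hadoop/hdfs, so the check and the export might look like this (the path is an assumption; use whatever location is missing from your own 'hadoop classpath' output):

    hadoop classpath
    export HADOOP_CLASSPATH=/usr/local/hadoop/share/hadoop/hdfs/*:$HADOOP_CLASSPATH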

Upvotes: 0

Aswita Hidayat

Reputation: 159

Make sure your hdfs path is correct by using which:

which hdfs
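With the environment from the question in place, the output should point at the installation's bin directory (a sketch, assuming HADOOP_INSTALL=/usr/local/hadoop as above):

    which hdfs
    # expected output: /usr/local/hadoop/bin/hdfs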

Upvotes: 0

Vkreddy

Reputation: 1718

Add

export HADOOP_PREFIX=/path/to/hadoop/installation/directory

at the end of the etc/hadoop/hadoop-env.sh file in the Hadoop installation directory. This allows the JVM to locate the class files.

Upvotes: 0

Sachin Garg

Reputation: 11

I faced the same issue.

Restart the terminal and try executing the command again.

A terminal restart is required for the PATH variable set inside the .bashrc file to take effect.
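Alternatively, instead of restarting the terminal, you can reload the file in the current shell:

    source ~/.bashrc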

Upvotes: 1

typ64

Reputation: 33

I had this error also. For me, the problem was that some files were missing because they were not extracted during the initial unzipping process.

What worked for me was going into the location of the .tar.gz file and extracting it again using:

tar xvzf <file_name>.tar.gz

Be advised this overwrites all of your saved files, so if you have made changes to any files, it would be best to create a copy of your Hadoop folder before extracting.
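A minimal sketch of that precaution (the folder path and archive name are placeholders for your own setup):

    # back up the existing Hadoop folder first, then re-extract over it
    cp -r /usr/local/hadoop /usr/local/hadoop.bak
    tar xvzf <file_name>.tar.gz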

Upvotes: 0

Eduardo Sanchez-Ros

Reputation: 1827

I tried the suggestions above but still got the same error. Setting HADOOP_CLASSPATH as follows in ~/.bashrc worked for me:

export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH

Upvotes: 0

unds

Reputation: 179

Thanks dcsesq.

Homebrew installed Hadoop 2.6.0 on Mac OS X 10.9.5 (Mavericks).

Add the following environment variables to ~/.profile or ~/.bash_profile:

export HADOOP_HOME=/usr/local/Cellar/hadoop/2.6.0
export HADOOP_PREFIX=$HADOOP_HOME/libexec
export HADOOP_MAPRED_HOME=$HADOOP_PREFIX
export HADOOP_COMMON_HOME=$HADOOP_PREFIX
export HADOOP_HDFS_HOME=$HADOOP_PREFIX
export YARN_HOME=$HADOOP_PREFIX

Source the file:

source ~/.bash_profile

Run the namenode format:

hdfs namenode -format

Boom, it started; this had bothered me for almost half a day.

Upvotes: 12

Nitesh Chaturvedi

Reputation: 37

Check and set the value of HADOOP_PREFIX to point to your Hadoop installation directory.

Upvotes: -2

Mike S

Reputation: 11409

For anyone still having trouble, you need to export the HADOOP_PREFIX environment variable.

Add the following line to your ~/.bashrc file:

export HADOOP_PREFIX=/path_to_hadoop_location

# for example:
# export HADOOP_PREFIX=/home/mike/hadoop-2.7.1

Then run . ~/.bashrc in your terminal and try again; this will fix the error.

Upvotes: 12

Jerry Ragland

Reputation: 621

It looks like when you execute the hadoop command, not all classes (JARs) are included in your classpath. Your classpath is missing the hadoop-hdfs-<version>.jar file.
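One quick way to confirm this (assuming the hadoop command itself still runs) is to list the classpath entries and look for the HDFS ones:

    hadoop classpath | tr ':' '\n' | grep hdfs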

Upvotes: 1

SachinJose

Reputation: 8522

Try using the following command for formatting (no space between name and node, and replace the hdfs command with hadoop):

hadoop namenode -format

Upvotes: 4
