Reputation: 1320
When I try to compile my program in Hadoop with this command
bin/hadoop com.sun.tools.javac.Main WordCounter.java
from the Hadoop folder, it says
Error: Could not find or load main class com.sun.tools.javac.Main
I looked at similar threads, where people suggested checking whether JAVA_HOME
is set properly. So in etc/hadoop/hadoop-env.sh
I added this line:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
Then I checked that tools.pack
is properly unpacked in /usr/lib/jvm/java-7-openjdk-amd64/lib, and it was.
Then I tried javac -version, which gave
javac 1.7.0_65
I tried reinstalling Java, but that didn't solve the problem.
Upvotes: 11
Views: 20602
Reputation: 159
I had to downgrade Hadoop to 2.9.2, and now it works.
I also had these in my environment:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export PATH=${JAVA_HOME}/bin:${PATH}
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
Upvotes: 0
Reputation: 2431
Try setting the HADOOP_CLASSPATH environment variable:
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar
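A minimal sketch of the full sequence, assuming a JDK installed at the path from the question (adjust JAVA_HOME for your system) and a Hadoop 2.x layout; these exports can also go in etc/hadoop/hadoop-env.sh so they apply to every Hadoop invocation:

```shell
# Point JAVA_HOME at a full JDK (path assumed from the question).
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
# tools.jar carries the com.sun.tools.javac.Main compiler class;
# adding it to HADOOP_CLASSPATH lets "bin/hadoop" find it.
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar

# Now the original compile command should work from the Hadoop folder:
bin/hadoop com.sun.tools.javac.Main WordCounter.java
```

Note that on Java 9 and later tools.jar no longer exists (the compiler moved into the jdk.compiler module), so this fix only applies to Java 8 and earlier.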
Upvotes: 17
Reputation: 328564
The error means you aren't using a JDK to run Hadoop. The main difference
between the JRE (a pure runtime) and the JDK is the Java compiler, javac.
To see whether you have a Java compiler, check two places: there should be
a javac binary in the $JAVA_HOME/bin
folder, and there must be a file $JAVA_HOME/lib/tools.jar.
In your case, the first one (the binary that starts the compiler) can be
missing, but you absolutely need tools.jar.
You say that you have a tools.pack,
but I haven't heard of that one before. Use your package manager to search
for openjdk
and then look for a package in the result list that says jdk.
On my system, that would be openjdk-7-jdk.
Install that package and the error should go away.
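The two checks above can be sketched as a small shell helper; the function name check_jdk and the default path are assumptions for illustration, so pass your own JAVA_HOME:

```shell
# check_jdk: report whether a directory looks like a full JDK,
# i.e. it has both bin/javac and lib/tools.jar (Java 8 and earlier).
check_jdk() {
    local home="$1"
    if [ -x "$home/bin/javac" ] && [ -f "$home/lib/tools.jar" ]; then
        echo "JDK"
    else
        echo "JRE or missing"
    fi
}

# Path taken from the question; substitute your own JAVA_HOME.
check_jdk "/usr/lib/jvm/java-7-openjdk-amd64"
```

If this prints "JRE or missing", installing the jdk package for your distribution (e.g. openjdk-7-jdk on Debian/Ubuntu) should supply both files.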
Upvotes: 0