pg2455

Reputation: 5178

Unable to locate Spark Jar: Java ClassNotFoundException

I am installing Spark on an Ubuntu server. I followed all the steps and the build even ended with BUILD SUCCESSFUL, but when I run ./bin/spark-shell it gives me this error: [screenshot of the Java ClassNotFoundException]

It probably means that Spark is unable to locate the .jar file built from ./spark-1.4.1/launcher/src/main/java/org/apache/spark/launcher, which contains Java source files such as Main.java.

Also, both $CLASSPATH and $SPARK_CLASSPATH are empty. I have installed Spark on Linux and Mac before and never ran into this problem. Can someone tell me what the problem might be here? I probably need to set the classpath or some environment variable to point to a jar that contains all the class files.

My JAVA_HOME points to /jvm/java-6-openjdk-amd64/jre. Is there any problem with this?

EDIT: I tried a few more things. I wrote a shell script to find the jar file that contains org/apache/spark/launcher/Main.class and found it at /usr/local/src/spark/spark-1.4.1/launcher/target/spark-launcher_2.10-1.4.1.jar. I pointed both CLASSPATH and SPARK_CLASSPATH at that location and tried running Spark again. It gave me the same error.
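For reference, a sketch of the kind of search script described above. It relies on the fact that jar files are zip archives and zip stores entry names uncompressed in the file headers, so a plain grep on the jar can find a .class entry. The search directory is the one from the post; adjust it for your install.

```shell
#!/bin/sh
# Print every jar under a directory whose archive headers contain a
# given class entry name. Zip stores entry names uncompressed, so
# grep -l on the raw jar file is enough to detect the entry.
find_class_jar() {
  # $1 = directory to search, $2 = class file path inside the jar
  find "$1" -name '*.jar' -exec grep -l "$2" {} \; 2>/dev/null
}

# Directory taken from the post; skipped if it does not exist here.
SPARK_SRC=/usr/local/src/spark/spark-1.4.1
if [ -d "$SPARK_SRC" ]; then
  find_class_jar "$SPARK_SRC" 'org/apache/spark/launcher/Main.class'
fi
```

On the setup described in the post this should print the launcher jar under launcher/target/.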

I also edited ./conf/spark-env.sh to set a different SPARK_CLASSPATH. That didn't work either.

Upvotes: 3

Views: 6013

Answers (2)

pg2455

Reputation: 5178

So after a lot of research and experimentation, I found that I had been using the wrong version of the JDK. I should have been using JDK 1.7, but I had been using Ubuntu's default JAVA_HOME, which points to JDK 1.6.

So install JDK 1.7 and point your JAVA_HOME to it. Spark works fine after that.
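One way to do that on Ubuntu is sketched below (a config fragment, not a definitive recipe: the exact package name and JVM path vary by release, so check with update-alternatives --list java on your machine):

```shell
# Install OpenJDK 7 and point JAVA_HOME at it (paths assumed, verify locally)
sudo apt-get install openjdk-7-jdk
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
java -version   # should now report a 1.7.x JVM
```

Put the two export lines in your shell profile (or in conf/spark-env.sh) so the setting survives new sessions.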

Upvotes: 6

joecoder

Reputation: 335

Are you running the spark-submit script from SPARK_HOME or another directory? The spark-submit script assumes you are running it from the SPARK_HOME directory.

Try setting the classpath with export CLASSPATH=${SPARK_HOME}/lib:$CLASSPATH and run it again.

Upvotes: 1
