crystyxn

Reputation: 1621

Class not found exception when trying to run JAR file with Spark

I am following the Spark Quick Start tutorial page.

I reached the last step and compiled my file into a JAR that should be ready to go.

Running my application from the terminal:

spark-submit --class "SimpleApp" --master local[4] /usr/local/spark/target/scala-2.11

Gives the following error:

2018-10-07 20:29:17 WARN  Utils:66 - Your hostname, test-ThinkPad-X230 resolves to a loopback address: 127.0.1.1; using 172.17.147.32 instead (on interface wlp3s0)
2018-10-07 20:29:17 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-10-07 20:29:17 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.ClassNotFoundException: SimpleApp
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:239)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-10-07 20:29:18 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-10-07 20:29:18 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-08d94e7e-ae24-4892-a704-727a6caa1733

Why won't it find my SimpleApp class? I've tried giving it the full path. My SimpleApp.scala is in my root Spark folder, /usr/local/spark/.

Upvotes: 0

Views: 1000

Answers (2)

ziad

Reputation: 404

The best way to deploy your app to Spark is to use the sbt-assembly plugin. It creates a fat JAR that contains all of your dependencies. After packaging your app, you have to point spark-submit at the JAR file itself, not at the directory that contains it. A minimal sketch of the setup follows. Good luck.
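Here is one way that setup could look, assuming sbt 1.x and the project layout from the Quick Start (the plugin version, dependency version, and project name below are assumptions, not from the question):

// project/plugins.sbt -- enable the sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt -- mark Spark as "provided" so it is not bundled into the fat JAR
name := "simple-app"
version := "1.0"
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.2" % "provided"

Running sbt assembly then produces target/scala-2.11/simple-app-assembly-1.0.jar (sbt-assembly's default name pattern), and you submit that file:

spark-submit --class "SimpleApp" --master local[4] target/scala-2.11/simple-app-assembly-1.0.jar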

Upvotes: 1

Rishu S

Reputation: 3968

Add your application JAR to your spark-submit command. A spark-submit invocation looks like the below:

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  <application-jar> \
  [application-args]

<application-jar> is the JAR file that you have built. In your command you passed the directory /usr/local/spark/target/scala-2.11; pass the path to the JAR file inside it instead.
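For example, with the build.sbt from the Quick Start, sbt package puts the JAR under target/scala-2.11 (the exact file name below is an assumption -- check what your build actually produced):

spark-submit --class "SimpleApp" --master local[4] /usr/local/spark/target/scala-2.11/simple-project_2.11-1.0.jar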

Hope this helps :)

Upvotes: 0
