mxm

Reputation: 51

Spark 2.6.0: Exception in thread "main" java.lang.ClassNotFoundException: <Main Class> when <Main Class> is in jar submitted to cluster

I can run a Spark app that I wrote in Scala locally:

sbt run ...

and I have it running fine from the command line with no errors.

When I spark-submit that same app to a 2.6.0 cluster, as follows:

spark-submit --class MyTest --master spark://my-spark-01a:7077 --deploy-mode cluster --supervise --executor-memory 20G --total-executor-cores 100 --jars $IGNITE_JARS,/home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar  /home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar

...I see the following error in stderr from the Spark Cluster

18/12/12 00:41:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/12/12 00:41:24 INFO spark.SecurityManager: Changing view acls to: ubuntu
18/12/12 00:41:24 INFO spark.SecurityManager: Changing modify acls to: ubuntu
18/12/12 00:41:24 INFO spark.SecurityManager: Changing view acls groups to: 
18/12/12 00:41:24 INFO spark.SecurityManager: Changing modify acls groups to: 
18/12/12 00:41:24 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(ubuntu); groups with view permissions: Set(); users  with modify permissions: Set(ubuntu); groups with modify permissions: Set()
18/12/12 00:41:24 INFO util.Utils: Successfully started service 'Driver' on port 50983.
18/12/12 00:41:24 INFO worker.WorkerWatcher: Connecting to worker spark://[email protected]:44303
Exception in thread "main" java.lang.ClassNotFoundException: MyTest

Class 'MyTest' is in my /home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar
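To double-check, I can list the assembly's contents with the standard JDK jar tool (note: if MyTest were declared inside a package, the --class argument would need the fully qualified name, e.g. a hypothetical com.example.MyTest rather than plain MyTest):

jar tf /home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar | grep MyTest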

I just don't get it. Why is Spark complaining that my main class can't be found when it is in the jar I am deploying? What (else) am I forgetting to make this thing just plain work?

Please help.

Upvotes: 1

Views: 409

Answers (1)

Jonathan

Reputation: 121

As you're submitting in cluster mode, you may need to pass this jar to the driver and executors explicitly. Try the spark-submit again after adding:

--conf spark.driver.extraClassPath=/your/file.jar
--conf spark.executor.extraClassPath=/your/file.jar
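
For example, reusing the paths from your question (a sketch only; spark.driver.extraClassPath and spark.executor.extraClassPath are resolved locally on each node, so the jar must exist at that path on every machine in the cluster):

spark-submit --class MyTest --master spark://my-spark-01a:7077 --deploy-mode cluster --supervise \
  --executor-memory 20G --total-executor-cores 100 \
  --conf spark.driver.extraClassPath=/home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar \
  --conf spark.executor.extraClassPath=/home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar \
  --jars $IGNITE_JARS,/home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar \
  /home/ubuntu/tmp/mytest-assembly-0.3.1.1.jar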

Upvotes: 2
