sirine

Reputation: 527

Cannot load main class from JAR file

I have a Spark-Scala application. I tried to display a simple message, "Hello my App". When I compile it with sbt compile and run it with sbt run, it works: the message is displayed successfully, but an error is shown as well, like this:

    Hello my application!
    16/11/27 15:17:11 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
    java.lang.InterruptedException
    ERROR ContextCleaner: Error in cleaning thread
    java.lang.InterruptedException
        at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
    16/11/27 15:17:11 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
    [success] Total time: 13 s, completed Nov 27, 2016 3:17:12 PM
    16/11/27 15:17:12 INFO DiskBlockManager: Shutdown hook called

I can't tell whether this is fine or not! Also, when I try to submit my JAR file after the run, I get an error.

My command line looks like:

spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

And the error is:

Error: Cannot load main class from JAR file:/root/projectFilms/appfilms
Run with --help for usage help or --verbose for debug output
16/11/27 15:24:11 INFO Utils: Shutdown hook called

Please, can anyone help me?

Upvotes: 8

Views: 40055

Answers (2)

user7097216

Reputation:

You forgot to use the --class parameter. Your command was:

spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

It should be:

spark-submit --class "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

Please note that if appfilms belongs to a package, don't forget to add the package name, as in packagename.appfilms.

I believe this will suffice.
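For illustration only, here is a minimal sketch of how the entry point relates to the --class value; the package name recommandation is an assumption, not something from the question:

    // Hypothetical layout, for illustration only.
    package recommandation

    object appfilms {
      def main(args: Array[String]): Unit = {
        println("Hello my application!")
      }
    }

With that layout the submit command would be spark-submit --class recommandation.appfilms --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar; without a package declaration, --class appfilms alone is enough.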

Upvotes: 1

Paul Velthuis

Reputation: 335

The error is due to the fact that the SparkContext is not stopped; this is required in versions higher than Spark 2.x. To prevent the error, stop the context with SparkContext.stop(), or sc.stop(), at the end of your application. Inspiration for solving this error came from my own experience and the following sources: Spark Context, Spark Listener Bus error.
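As a minimal sketch (assuming the entry point is an object called appfilms and the context is created directly; the names are illustrative, not taken from the question's code), the fix looks like this:

    import org.apache.spark.{SparkConf, SparkContext}

    object appfilms {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("appfilms").setMaster("local[4]")
        val sc = new SparkContext(conf)

        println("Hello my application!")

        // Stop the context explicitly so the SparkListenerBus and ContextCleaner
        // threads shut down cleanly instead of being interrupted at JVM exit.
        sc.stop()
      }
    }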

Upvotes: 7
