Reputation: 31
I'm working on a Scala/Spark project, and I would like to export the project to a jar file and run it on Spark via spark-submit.
I tried this solution:
File -> Project Structure -> Artifacts -> + -> Jar -> From modules with dependencies -> selected the main class after browsing -> selected "extract to the target jar" -> the directory for META-INF gets populated automatically -> OK -> Apply -> OK -> Build -> Build Artifacts -> Build.
But I can't find my main class in the resulting jar file, so I can't run it.
Upvotes: 1
Views: 1605
Reputation: 6739
The basic idea you can follow:

Since you are working in Scala, you can use sbt as your build-management tool to declare all the dependencies of your project.
You can use the sbt-assembly plugin to build a fat jar.
Then export this fat jar to your cluster to submit your Spark jobs.
Please search online for more details, or you can start with this project https://github.com/khodeprasad/spark-scala-examples and integrate the sbt-assembly plugin to create fat jars by following its documentation: https://github.com/sbt/sbt-assembly
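A minimal setup might look like the following sketch. All names and version numbers here are illustrative assumptions; check the sbt-assembly README and Maven Central for current versions.

```scala
// project/plugins.sbt — wires the sbt-assembly plugin into the build
// (version 2.1.5 is an assumption; check the sbt-assembly README for the latest)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")
```

```scala
// build.sbt — minimal Spark project; names and versions are illustrative
name := "spark-app"
scalaVersion := "2.12.18"

// Mark Spark as "provided": the cluster supplies it at runtime,
// which keeps the fat jar much smaller
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.0" % "provided"

// Bake the entry point into the jar's manifest (com.example.Main is hypothetical)
assembly / mainClass := Some("com.example.Main")
```

Running `sbt assembly` then produces the fat jar under `target/scala-2.12/`, and you can submit it with something like `spark-submit --class com.example.Main target/scala-2.12/spark-app-assembly-0.1.0-SNAPSHOT.jar`.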
Upvotes: 4