Ahmed Radhwen Khadhri

Reputation: 31

How to export Scala Spark project to jar

I'm working on a Scala/Spark project, and I would like to export my project to a jar file and run it in Spark via spark-submit.
I tried this solution:
File -> Project Structure -> Artifacts -> + -> Jar -> From modules with dependencies -> Selected Main Class after browsing -> selected extract to the target jar -> Directory for META-INF automatically gets populated -> OK -> Apply -> OK -> Build -> Build Artifacts -> Build.
But I can't find my main class in the jar file, so I can't run it.

Upvotes: 1

Views: 1605

Answers (1)

Prasad Khode

Reputation: 6739

The basic idea you can follow:

Since you are working with Scala

  • You can use sbt as your build tool to manage all of your project's dependencies

  • You can use the sbt-assembly plugin to build a fat jar

  • Ship this fat jar to your cluster to submit your Spark jobs.
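As a minimal sketch of that setup (the project name, versions, and main class below are placeholders, not from the question):

```scala
// project/plugins.sbt -- enables the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

// build.sbt -- hypothetical project configuration
name := "my-spark-app"          // placeholder project name
scalaVersion := "2.12.18"

// Mark Spark as "provided": the cluster supplies it at runtime,
// so it is excluded from the fat jar and keeps the jar small
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0" % "provided"

// The main class spark-submit will run (placeholder name);
// this is what ends up in the jar's manifest
assembly / mainClass := Some("com.example.Main")
```

Running `sbt assembly` then produces the fat jar under `target/scala-2.12/`.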

For more details, you can start from this example project https://github.com/khodeprasad/spark-scala-examples and integrate the sbt-assembly plugin to create fat jars by following its documentation: https://github.com/sbt/sbt-assembly
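Once the fat jar is built, submitting it to the cluster could look like this (the class name, master, and jar path are placeholders you would adapt to your setup):

```shell
# Hypothetical example: adjust --class, --master, and the jar path
spark-submit \
  --class com.example.Main \
  --master yarn \
  --deploy-mode cluster \
  target/scala-2.12/my-spark-app-assembly-0.1.0.jar
```

The `--class` flag tells Spark which main class to run, which is why the original IntelliJ-artifact approach fails when the main class is missing from the jar's manifest.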

Upvotes: 4

Related Questions