sirine

Reputation: 527

create project jar in scala

I have a self-contained application in SBT. My data is stored on HDFS (the Hadoop file system). How can I get a jar file so I can run my work on another machine?

The directory of my project is the following:

/MyProject
   /target
        /scala-2.11
                 /MyApp_2.11-1.0.jar
   /src
      /main
           /scala

Upvotes: 1

Views: 1401

Answers (2)

Komal BL Yogi

Reputation: 26

Unlike Java, in Scala the file's package name doesn't have to match the directory name. In fact, for simple tests like this, you can place the file in the root directory of your SBT project if you prefer.
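For example, a file like the following compiles fine from the project root even though its package doesn't match any directory (a minimal sketch; the package and object names here are made up):

// This file can sit at the project root or anywhere under src/main/scala,
// regardless of the package it declares.
package com.example.hello

object Hello extends App {
  println("Hello from SBT")
}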

From the root directory of the project, you can compile the project:

$ sbt compile

Run the project:

$ sbt run

Package the project:

$ sbt package
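For reference, a minimal build.sbt that would produce the MyApp_2.11-1.0.jar shown in the question might look like this (a sketch; the exact Scala and Spark versions are assumptions):

name := "MyApp"

version := "1.0"

scalaVersion := "2.11.8"

// Assumed dependency; "provided" keeps Spark out of the packaged jar,
// since spark-submit supplies it at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2" % "provided"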

Here is a link that explains this: http://alvinalexander.com/scala/sbt-how-to-compile-run-package-scala-project

Upvotes: 1

marios

Reputation: 8996

If you don't have any dependencies, then running sbt package will create a jar with all your code.

You can then run your Spark app as:

$SPARK_HOME/bin/spark-submit --name "an-app" my-app.jar  
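For context, a minimal sketch of what the main class inside my-app.jar could look like (the object name and HDFS path are hypothetical):

import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("an-app")
    val sc = new SparkContext(conf)
    // Read input from HDFS (hypothetical path) and do some work.
    val lines = sc.textFile("hdfs:///user/me/input.txt")
    println(s"Line count: ${lines.count()}")
    sc.stop()
  }
}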

If your project has external dependencies (other than Spark itself or its own dependencies, for which the above approach still works), then you have two options:

1) Use the sbt assembly plugin to create an uber jar with your entire classpath. Running sbt assembly will create another jar which you can use in the same way as before; see the sketch after this list.

2) If you only have a few simple dependencies (say, just joda-time), then you can simply include them in your spark-submit invocation:

$SPARK_HOME/bin/spark-submit --name "an-app" --packages "joda-time:joda-time:2.9.6" my-app.jar 
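For option 1, enabling the plugin is a one-liner in project/plugins.sbt (a sketch; the version number is an assumption, so check the sbt-assembly README for the one matching your sbt release):

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

After that, sbt assembly writes the uber jar to target/scala-2.11/ (named something like MyApp-assembly-1.0.jar by default), and you submit it with spark-submit exactly as before.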

Upvotes: 3
