Kagestrife

Reputation: 79

Install a .jar in Spark

I am pretty new to Spark and Scala at the same time, so some things need clarification. I went to the web looking for a definitive answer to my question, but I did not really end up with one.

At the moment, I am running the spark-shell in order to write some basic Scala and work through my tutorials. Now, the tutorial wants me to add a library to Spark so I can import it and use it for the examples. I have downloaded the .jar file of the library. Should I put it in the /spark/jars/ folder? Is that enough to import it, or do I also need to declare it somewhere else? Do I need to pass an option when running ./spark-shell?

Also, when I create a standalone program (using sbt and declaring the library in the build.sbt), will Spark find the .jar in the /spark/jars/ folder or do I need to put it elsewhere?
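For context, this is roughly what I mean by declaring the library in build.sbt (the organization, name, and version below are placeholders, not the actual library):

```scala
// build.sbt -- minimal sketch; coordinates below are placeholders
name := "my-spark-app"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // Spark itself, marked "provided" since the cluster supplies it at runtime
  "org.apache.spark" %% "spark-core" % "2.0.1" % "provided",
  // the extra library from the tutorial (placeholder coordinates)
  "com.example" %% "some-library" % "1.0.0"
)
```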

Upvotes: 0

Views: 2145

Answers (1)

evan.oman

Reputation: 5562

Any jar can be added to spark-shell by using the --jars option:

evan@vbox:~> cat MyClass.java
public class MyClass
{
    public static int add(int x, int y)
    {
        return x + y;
    }
}
evan@vbox:~> javac MyClass.java
evan@vbox:~> jar cvf MyJar.jar MyClass.class
added manifest
adding: MyClass.class(in = 244) (out= 192)(deflated 21%)
evan@vbox:~> spark-shell --jars ./MyJar.jar
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala> MyClass.add(2,3)
res0: Int = 5

If you are going to be building a project with sbt that has dependencies, I would recommend making an "uber jar" with sbt assembly. This creates a single JAR file containing all of your dependencies, so you only have to add one jar using the command above.
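Enabling the plugin is a one-liner in your project (the plugin version here is just illustrative; check the sbt-assembly README for the version matching your sbt):

```scala
// project/plugins.sbt -- enables the sbt-assembly plugin (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
```

Then running sbt assembly produces a single fat jar under target/scala-2.11/ (for Scala 2.11 builds), which you can pass to spark-shell --jars exactly as shown above. Note that Spark itself should be marked "provided" in build.sbt so it is not bundled into the assembly.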

Upvotes: 2
