L. Pardus

Reputation: 13

Writing a Java JAR for Spark

Sorry if this seems like a silly question. I need to write a very simple String processing function for someone else's Spark application, but I have limited/no experience with Spark. I was told that I could just write it in Java and deliver the jar file.

I'm a bit confused about what the design would look like, though. Would it work if I just created a standard class that contains the method (without any Spark-specific code)? How would this be initialized/called from the (Scala) Spark application after the jar is imported?

Upvotes: 1

Views: 999

Answers (1)

evan.oman

Reputation: 5562

There's no need to add anything Spark-specific (unless you need to use Spark classes). Here is an example:

evan@vbox:~> cat MyClass.java
public class MyClass
{
    public static int add(int x, int y)
    {
        return x + y;
    }
}
evan@vbox:~> javac MyClass.java
evan@vbox:~> jar cvf MyJar.jar MyClass.class
added manifest
adding: MyClass.class(in = 244) (out= 192)(deflated 21%)
evan@vbox:~> spark-shell --jars ./MyJar.jar
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala> MyClass.add(2,3)
res0: Int = 5
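
Since add is a static method, you can also call it inside Spark transformations once the jar is on the classpath via --jars. A minimal sketch (using the sc the shell provides):

val result = sc.parallelize(Seq(1, 2, 3))
  .map(x => MyClass.add(x, 10)) // static call, nothing to serialize beyond the closure
  .collect()                    // would yield Array(11, 12, 13)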

In this case Scala was able to treat the Java primitive int as scala.Int, so there were no Scala/Java interoperability considerations. Depending on your function's signature you may need to think about this, but that is a Scala/Java issue, not a Spark issue.
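
For instance (a hypothetical sketch, not part of the session above): if your Java method returned a java.util.List<String>, the Scala caller would typically convert it with JavaConverters before applying Scala collection operations:

import scala.collection.JavaConverters._

// java.util.Arrays.asList stands in for what such a Java method might return
val javaList = java.util.Arrays.asList("a", "b", "c") // java.util.List[String]
val scalaBuf = javaList.asScala                       // scala.collection.mutable.Buffer[String]
scalaBuf.map(_.toUpperCase)                           // Scala collection ops now apply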

Upvotes: 2
