Himanshu Srivastava

Reputation: 95

spark-shell - Not able to access java functions in jars

I started exploring Spark two days ago, so I am pretty new to it. My use case is calling a Java function from an external jar in the Scala code I am writing in spark-shell, but I think I am not loading my jar properly. Here is what I am doing:

spark-shell --master local[2] --jars pathToJarsWithComma --conf="spark.driver.extraClassPath=pathToJarsWithColon" --conf="spark.executor.extraClassPath=pathToJarsWithColon"

This is how I launch my spark-shell, passing all the required jars. Whenever I try to call the Java static function, like:

rdd1.collect.foreach(a => MyClass.myfunction(a))

I get this error:

<console>:26: error: not found: value MyClass

I want to know if my understanding is correct: can we use Java functions in Spark by loading external jars? If yes, what am I doing wrong here? Please guide.

Upvotes: 0

Views: 1306

Answers (1)

mkhan

Reputation: 621

We can load Java functions in Spark by loading external jars. I am not sure whether you need the confs you added at the end to make this work. For me, I tried out the following to test loading a jar in spark-shell.

./bin/spark-shell --master <spark url>  --jars /home/SparkBench/Terasort/target/jars/guava-19.0-rc2.jar

After that, in the shell, I tried to access a field from a class in the jar.

scala> import com.google.common.primitives.UnsignedBytes
import com.google.common.primitives.UnsignedBytes
scala> UnsignedBytes.MAX_POWER_OF_TWO
res0: Byte = -128

As you can see, I was able to access fields from the external jar. You can also test whether you can access your class by reading a simple field from it.
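
In your case, something along these lines should confirm whether the class from your jar is visible (a minimal sketch; the package com.example and the method name myfunction are placeholders for whatever your jar actually contains):

// hypothetical fully qualified name -- replace with the real package from your jar
import com.example.MyClass

// call the static Java method on the driver first to confirm the class resolves
MyClass.myfunction("test")

// then use it inside the RDD operation, as in your question
rdd1.collect.foreach(a => MyClass.myfunction(a))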

Upvotes: 0
