Reputation: 1776
I created a jar package from a project with this file tree:
build.sbt
src/main
src/main/scala
src/main/scala/Tester.scala
src/main/scala/main.scala
where Tester is a class with a function named print(), and main contains an object that runs it and prints "Hi!" (taken from the Spark documentation). I built the jar with sbt successfully and it worked well with spark-submit.
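For reference, here is a minimal sketch of the two files, reconstructed from the description above (the exact bodies are assumptions):

// src/main/scala/Tester.scala
class Tester {
  // prints the greeting; body assumed from the description
  def print(): Unit = println("Hi!")
}

// src/main/scala/main.scala
object main {
  // entry point used by spark-submit
  def main(args: Array[String]): Unit = {
    new Tester().print()
  }
}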
Now I want to add it to spark-shell and use the Tester class to create objects, and so on. I added the jar file to spark-defaults.conf, but:
scala> val t = new Tester();
<console>:23: error: not found: type Tester
val t = new Tester();
Upvotes: 47
Views: 71927
Reputation: 714
I tried two options and both worked for me.
spark-shell --jars <path of jar>
Or open spark-shell and type :help; you will get a list of all the available commands. Use the one below to add a jar to the classpath:
:require /full_path_of_jar
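For example, launching the shell with the jar from the question (the path and file name are placeholders), the class resolves right away:

$ spark-shell --jars /path/to/tester.jar
scala> val t = new Tester()
scala> t.print()
Hi!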
Upvotes: 1
Reputation: 3692
You can try providing the jars with the --jars argument, as below:
./spark-shell --jars pathOfJarsWithCommaSeparated
Or you can add the following configuration to your spark-defaults.conf, but remember to remove the .template suffix from the end of the spark-defaults file name. Note that extraClassPath entries are joined with the platform classpath separator (: on Linux/macOS), not commas:
spark.driver.extraClassPath pathOfJarsColonSeparated
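A sketch of what the entry might look like, assuming two jars at hypothetical paths:

spark.driver.extraClassPath /opt/jars/tester.jar:/opt/jars/other.jar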
Upvotes: 75
Reputation: 3745
If you want to add a .jar to the classpath after you've entered spark-shell, use :require. Like:
scala> :require /path/to/file.jar
Added '/path/to/file.jar' to classpath.
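After that, the class from the jar is usable in the same session (a sketch, assuming the jar contains the Tester class from the question):

scala> val t = new Tester()
scala> t.print()
Hi!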
Upvotes: 43