How to add a library to spark-shell

I have a library that I want to use in spark-shell. How can I add this library so that it is accessible from the shell? Its dependency declarations are:

sbt:

resolvers += Resolver.bintrayRepo("unsupervise", "maven")
libraryDependencies += "com.github.unsupervise" %% "spark-tss" % "0.1.1"

Maven:

<dependencies>
<!-- Other Dependencies -->
    <dependency>
        <groupId>com.github.unsupervise</groupId>
        <artifactId>spark-tss_2.11</artifactId>
        <version>0.1.1</version>
    </dependency>
</dependencies>
<repositories>
<!-- Other Repositories ... -->
    <repository>
        <id>bintrayunsupervisemaven</id>
        <name>bintray-unsupervise-maven</name>
        <url>https://dl.bintray.com/unsupervise/maven/</url>
        <layout>default</layout>
    </repository>
</repositories>

Upvotes: 1

Views: 2234

Answers (2)

Aravind Yarram

Reputation: 80186

When you have the jars locally, pass them to --jars as a comma-separated list:

./spark-shell --jars /path/to/first.jar,/path/to/second.jar

When the binary artifacts (jars) are managed through an artifact repository such as Maven Central or Nexus, you instead pass the artifact coordinates: group ID, artifact ID, and version.

Reference: http://spark.apache.org/docs/latest/rdd-programming-guide.html#using-the-shell

./bin/spark-shell --master local[4] --packages "com.github.unsupervise:spark-tss_2.11:0.1.1"

Upvotes: 1

Jasper-M

Reputation: 15086

Use the --repositories and --packages parameters:

spark-shell \
  --repositories "https://dl.bintray.com/unsupervise/maven" \
  --packages "com.github.unsupervise:spark-tss_2.11:0.1.1"
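
If you want every spark-shell session to pick these up without repeating the flags, the same coordinates can be set once in conf/spark-defaults.conf via the standard spark.jars.packages and spark.jars.repositories configuration properties (a sketch, assuming a default Spark installation layout):

```
# conf/spark-defaults.conf
# Extra remote repository to resolve artifacts from
spark.jars.repositories  https://dl.bintray.com/unsupervise/maven
# Maven coordinates (groupId:artifactId:version), comma-separated if several
spark.jars.packages      com.github.unsupervise:spark-tss_2.11:0.1.1
```

Command-line flags still override these defaults on a per-session basis.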

Upvotes: 2
