tapas kumar Mahanta

Reputation: 61

How to add external jar files to a spark scala project

I am trying to use an LSH implementation in Scala (https://github.com/marufaytekin/lsh-spark) in my Spark project. I cloned the repository and made some changes to its sbt file (added Organisation): sbt file of LSH implementation

To use this implementation, I compiled it using sbt compile, moved the jar file to the "lib" folder of my project, and updated my project's sbt configuration file, which looks like this: sbt file of my main project

Now when I try to compile my project using sbt compile, it fails to load the external jar file, showing the error message "unresolved dependency: com.lendap.spark.lsh.LSH#lsh-scala_2.10;0.0.1-SNAPSHOT: not found". Am I following the right steps for adding an external jar file? How do I solve the dependency issue?
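For context, jars placed in lib/ are treated by sbt as unmanaged dependencies and picked up automatically, so the same library should not also appear in libraryDependencies (that entry makes sbt try to resolve it from remote repositories, producing exactly this "unresolved dependency" error). A minimal build.sbt sketch; the project name and versions below are illustrative, not taken from the question:

```scala
// build.sbt -- minimal sketch; name and versions are placeholders
name := "my-spark-project"

scalaVersion := "2.10.6"

// Managed dependencies are resolved from repositories.
// "provided" assumes the Spark runtime supplies these at execution time.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided"
)

// Note: no entry for lsh-spark here. Any jar dropped into lib/
// is picked up automatically as an unmanaged dependency.
```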

Upvotes: 0

Views: 2737

Answers (2)

Dazzler

Reputation: 847

As an alternative, you can build the lsh-spark project and add the jar to your Spark application. To add external jars, the addJar option can be used when executing the Spark application. Refer to Running a Spark application on YARN.
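A sketch of that approach using SparkContext.addJar, which ships a jar to the executors at runtime; the application name and jar path are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: app name and jar path below are placeholders.
val conf = new SparkConf().setAppName("lsh-example")
val sc = new SparkContext(conf)

// Ships the jar to the executors; the equivalent at submission time
// is passing --jars /path/to/lsh-spark_2.10-0.0.1-SNAPSHOT.jar
// to spark-submit.
sc.addJar("/path/to/lsh-spark_2.10-0.0.1-SNAPSHOT.jar")
```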

Upvotes: 1

Jonathan Taws

Reputation: 1188

This issue isn't related to Spark but to your sbt configuration.

Make sure you followed the correct folder structure imposed by sbt and added your jar to the lib folder, as explained here - the lib folder should be at the same level as build.sbt (cf. this post).

You might also want to check out this SO post.
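The layout sbt expects can be sketched as follows (plain-text diagram; the project and source file names are illustrative):

```
my-project/
├── build.sbt
├── lib/                  <- unmanaged jars go here, next to build.sbt
│   └── lsh-spark_2.10-0.0.1-SNAPSHOT.jar
├── project/
│   └── build.properties
└── src/
    └── main/
        └── scala/
            └── Main.scala
```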

Upvotes: 1
