Reputation: 3868
Hi, I am trying to run my Spark service against a cluster. As it turns out, I have to call setJars and set my application jar there. If I do it using a physical path like the following, it works:
conf.setJars(new String[]{"/path/to/jar/Sample.jar"});
but if I try to use the JavaSparkContext (or SparkContext) API methods jarOfClass or jarOfObject, it doesn't work. Basically, the API can't find the jar itself.
The following return empty:
JavaSparkContext.jarOfObject(this);
JavaSparkContext.jarOfClass(this.getClass())
It would be an excellent API if only it worked! Has anyone else been able to make use of this?
Upvotes: 2
Views: 556
Reputation: 3835
[I have included an example for Scala. I am sure it will work the same way for Java.]
It will work if you do:
SparkContext.jarOfObject(this.getClass)
Surprisingly, this works for a Scala object as well as a Scala class.
Upvotes: 1
Reputation: 15539
How are you running the app? If you are running it from an IDE or a build tool such as sbt, then
/path/to/jar/Sample.jar
exists, so the hard-coded path works. But the classes the JVM running your app is actually using were not loaded from that jar, so the API cannot find it.
Upvotes: 0
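To illustrate the point above: jarOfClass essentially inspects the URL a class was loaded from and only yields a path when that URL points inside a jar. Here is a minimal, self-contained Java sketch of that check (the class and method names are made up for illustration, not Spark's actual code). When you run from an IDE or sbt, your classes come from a directory of .class files, so the check below returns null, which is why jarOfClass comes back empty:

```java
import java.net.URL;

public class JarLocator {
    // Roughly the idea behind jarOfClass: look up the class's own
    // .class resource and keep the location only if it is a jar: URL.
    static String jarOf(Class<?> cls) {
        String resource = "/" + cls.getName().replace('.', '/') + ".class";
        URL url = cls.getResource(resource);
        if (url != null && "jar".equals(url.getProtocol())) {
            // A jar URL looks like: jar:file:/path/to/Sample.jar!/pkg/Name.class
            String path = url.getPath();
            return path.substring("file:".length(), path.indexOf('!'));
        }
        // Loaded from a plain directory (IDE / sbt run) or not found at all.
        return null;
    }

    public static void main(String[] args) {
        // Compiled and run from a classes directory, this prints null,
        // mirroring the empty result the question describes.
        System.out.println(jarOf(JarLocator.class));
    }
}
```

So the takeaway is: jarOfClass can only help once the application has actually been packaged into a jar; until then, passing the jar path to setJars explicitly is the workable option.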