S. Gill

Reputation: 96

Why does %AddJar command in Bluemix give an error?

Sorry if this is a silly question. I have some simple code in a Scala Notebook in a Bluemix Spark instance. I am trying to add a jar from a GitHub repository in the manner indicated in the tutorial (https://console.ng.bluemix.net/docs/services/AnalyticsforApacheSpark/index-gentopic2.html#developing_with_notebooks):

import scala.collection.breakOut
%AddJar https://github.com/IBM-Bluemix/cf-deployment-tracker-client-java/blob/master/dep-jar/com.ibm.json4j_1.0.9.jar

The Notebook output informs me that the download is finished but an exception is then thrown:

Starting download from https://github.com/IBM-Bluemix/cf-deployment-tracker-client-java/blob/master/dep-jar/com.ibm.json4j_1.0.9.jar
Finished download of com.ibm.json4j_1.0.9.jar
Out[30]:
Name: java.lang.NullPointerException
Message: null
StackTrace: scala.reflect.io.ZipArchive$.fromFile(ZipArchive.scala:36)
scala.reflect.io.ZipArchive$.fromFile(ZipArchive.scala:34)
scala.reflect.io.AbstractFile$.getDirectory(AbstractFile.scala:48)
scala.reflect.io.AbstractFile$.getDirectory(AbstractFile.scala:36)
scala.tools.nsc.Global.scala$tools$nsc$Global$$matchesCanonical$1(Global.scala:920)
scala.tools.nsc.Global$$anonfun$16.apply(Global.scala:924)
scala.tools.nsc.Global$$anonfun$16.apply(Global.scala:924)
scala.collection.Iterator$class.find(Iterator.scala:780)
scala.collection.AbstractIterator.find(Iterator.scala:1157)
scala.collection.IterableLike$class.find(IterableLike.scala:79)
scala.collection.AbstractIterable.find(Iterable.scala:54)
scala.tools.nsc.Global.scala$tools$nsc$Global$$assoc$1(Global.scala:924)
scala.tools.nsc.Global$$anonfun$17.apply(Global.scala:933)
scala.tools.nsc.Global$$anonfun$17.apply(Global.scala:933)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
scala.tools.nsc.Global.invalidateClassPathEntries(Global.scala:933)
com.ibm.spark.interpreter.ScalaInterpreter.updateCompilerClassPath(ScalaInterpreter.scala:167)
com.ibm.spark.interpreter.ScalaInterpreter.addJars(ScalaInterpreter.scala:90)
com.ibm.spark.magic.builtin.AddJar.execute(AddJar.scala:121)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
java.lang.reflect.Method.invoke(Method.java:507)
com.ibm.spark.utils.DynamicReflectionSupport.invokeMethod(DynamicReflectionSupport.scala:106)
com.ibm.spark.utils.DynamicReflectionSupport.applyDynamic(DynamicReflectionSupport.scala:78)
com.ibm.spark.magic.MagicExecutor.executeMagic(MagicExecutor.scala:32)
com.ibm.spark.magic.MagicExecutor.applyDynamic(MagicExecutor.scala:21)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:59)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:61)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:63)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:65)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:67)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:69)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:75)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:77)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83)
$line144.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85)
$line144.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:87)
$line144.$read$$iwC$$iwC$$iwC.<init>(<console>:89)
$line144.$read$$iwC$$iwC.<init>(<console>:91)
$line144.$read$$iwC.<init>(<console>:93)
$line144.$read.<init>(<console>:95)
$line144.$read$.<init>(<console>:99)
$line144.$read$.<clinit>(<console>)
$line144.$eval$.<init>(<console>:7)
$line144.$eval$.<clinit>(<console>)
$line144.$eval.$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
java.lang.reflect.Method.invoke(Method.java:507)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:296)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:291)
com.ibm.spark.global.StreamState$.withStreams(StreamState.scala:80)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:290)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:290)
com.ibm.spark.utils.TaskManager$$anonfun$add$2$$anon$1.run(TaskManager.scala:123)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1153)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
java.lang.Thread.run(Thread.java:785)

I followed the tutorial pretty closely, but it seems I have done something wrong. Any pointers would be much appreciated.

Thanks

Upvotes: 0

Views: 514

Answers (1)

Sven Hafeneger

Reputation: 801

You used the wrong URL for the jar: do not use the blob path. The blob URL serves GitHub's HTML page rather than the jar itself, so the downloaded file is not a valid zip archive, which is why the NullPointerException is thrown. Use the raw path instead:

%AddJar https://github.com/IBM-Bluemix/cf-deployment-tracker-client-java/raw/master/dep-jar/com.ibm.json4j_1.0.9.jar

I tried it myself: the jar can be added this way, although you may then run into a separate issue with an assertion.

If it still does not work (because the wrong URL was used previously), restart your kernel or pass the -f flag to force a fresh download:

%AddJar https://github.com/IBM-Bluemix/cf-deployment-tracker-client-java/raw/master/dep-jar/com.ibm.json4j_1.0.9.jar -f
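Once %AddJar reports a successful download, a quick sanity check is to import and use a class from the jar in the next cell. Here is a minimal sketch; the package and class name (com.ibm.json.java.JSONObject) are an assumption based on JSON4J's usual naming, so adjust them to whatever the jar actually ships:

// Sketch only: class name assumed from JSON4J conventions, adjust if needed
import com.ibm.json.java.JSONObject

val obj = new JSONObject()            // JSONObject behaves like a java.util.Map
obj.put("service", "deployment-tracker")
println(obj.serialize())              // expected output: {"service":"deployment-tracker"}

If the import resolves without a compilation error, the jar is on the notebook's classpath.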

Upvotes: 2
