Reputation: 23
I am trying to use the ARIMA object (Scala), imported from a package, in my Java program. Compilation succeeds, meaning the ARIMA class is recognized at compile time, but at runtime there is a NoClassDefFoundError for the ARIMA object. The ARIMAModel class imports without any problem, since it is a class.
Is there any way to use the Scala object from my Java program?
Here is the source code for the object in the Scala package.
File: .../com/cloudera/sparkts/models/ARIMA.scala
package com.cloudera.sparkts.models

object ARIMA {
  def autoFit(ts: Vector, maxP: Int = 5, maxD: Int = 2, maxQ: Int = 5): ARIMAModel = {
    ...
  }
}

class ARIMAModel(...) {
  ...
}
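(Background on why this compiles from Java at all: a Scala object compiles to a singleton class named ARIMA$ plus a class named ARIMA containing static forwarders. Below is a rough Java-level sketch of the emitted classes; the names ARIMA and ARIMA$ are what the compiler really generates, but the bodies are placeholders, and Vector is assumed here to be org.apache.spark.mllib.linalg.Vector as in spark-ts.)

import org.apache.spark.mllib.linalg.Vector;
import com.cloudera.sparkts.models.ARIMAModel;

// Sketch of the two JVM classes scalac emits for `object ARIMA`.
final class ARIMA$ {
    // The singleton instance that every Scala reference to `ARIMA` resolves to.
    public static final ARIMA$ MODULE$ = new ARIMA$();

    private ARIMA$() {}

    public ARIMAModel autoFit(Vector ts, int maxP, int maxD, int maxQ) {
        return null; // actual fitting logic elided in this sketch
    }
}

// A class with static forwarders is emitted as well, which is why
// the Java call `ARIMA.autoFit(...)` compiles in the first place.
final class ARIMA {
    public static ARIMAModel autoFit(Vector ts, int maxP, int maxD, int maxQ) {
        return ARIMA$.MODULE$.autoFit(ts, maxP, maxD, maxQ);
    }
}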
Here is my Java code.
File: src/main/java/SingleSeriesARIMA.java
import com.cloudera.sparkts.models.ARIMA;
import com.cloudera.sparkts.models.ARIMAModel;

public class SingleSeriesARIMA {
    public static void main(String[] args) {
        ...
        ARIMAModel arimaModel = ARIMA.autoFit(tsVector, 1, 0, 1);
        ...
    }
}
Here is the error.
Exception in thread "main" java.lang.NoClassDefFoundError: com/cloudera/sparkts/models/ARIMA
at SingleSeriesARIMA.main(SingleSeriesARIMA.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.sparkts.models.ARIMA
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I am using Scala version 2.11.8 and Java 1.8.
Upvotes: 1
Views: 330
Reputation: 6323
You need to supply the dependency that contains the ARIMA object to the Spark cluster using the --jars option, as below:

spark-submit --jars <path>/<to>/sparkts-0.4.1.jar --class SingleSeriesARIMA target/simple-project-1.0.jar
This passes the extra dependency along with the application jar, so both are available at Spark runtime.
To call the ARIMA object from Java, use:

ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1);
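Put together, the caller might look like the sketch below. The sample data and the mllib Vector/Vectors imports are assumptions (spark-ts 0.4.x uses org.apache.spark.mllib.linalg.Vector), and all four autoFit parameters must be passed explicitly because Scala default arguments are not usable from Java:

import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;

import com.cloudera.sparkts.models.ARIMA$;
import com.cloudera.sparkts.models.ARIMAModel;

public class SingleSeriesARIMA {
    public static void main(String[] args) {
        // Placeholder series; substitute your real time series here.
        Vector tsVector = Vectors.dense(1.0, 1.4, 0.9, 1.7, 2.1, 1.8);

        // Call the Scala object's method through its singleton instance.
        ARIMAModel arimaModel = ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1);

        System.out.println("Fitted model: " + arimaModel);
    }
}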
Upvotes: 2