Reputation: 397
I have imported the required packages, and I am even able to import SparkBundleContext:
import org.apache.spark.ml.bundle.SparkBundleContext
But then when I do
val sbc = SparkBundleContext()
I get this error:
java.lang.NoClassDefFoundError: org/apache/spark/ml/clustering/GaussianMixtureModel
Upvotes: 0
Views: 304
Reputation: 23109
The java.lang.NoClassDefFoundError means the Spark ML classes are missing from your runtime classpath: org.apache.spark.ml.clustering.GaussianMixtureModel lives in spark-mllib, not spark-core. If you are using Maven, add the Spark ML dependency as
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.1.1</version>
</dependency>
If you are using sbt, add the dependency as
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.1"
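With sbt you can also use the %% operator so the Scala binary version suffix is picked for you; a minimal build.sbt sketch:
// %% appends the project's Scala binary version (e.g. _2.11) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.1"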
Use the dependency version that matches your Scala version (the _2.11 suffix is the Scala binary version, not the Spark version).
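Once spark-mllib is on the runtime classpath, constructing the context exactly as in the question should no longer throw; a minimal sketch:
import org.apache.spark.ml.bundle.SparkBundleContext

// With spark-mllib available at runtime, the Spark ML model classes
// (including GaussianMixtureModel) resolve and construction succeeds.
val sbc = SparkBundleContext()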
Hope this helps!
Upvotes: 0