Reputation: 1
I am getting this error when I execute my Scala code from a JAR file. Versions used:
Scala: 2.12
sbt: 1.7.1
Spark: 3.1.2
Java: JDK 1.8.0_331
Error:

```
NoClassDefFoundError: scala/Product$class
Caused by: ClassNotFoundException: scala.Product$class
```
Dependencies used in build.sbt:

```scala
"org.apache.spark" %% "spark-core" % "3.1.2" % "provided",
"org.apache.spark" %% "spark-sql" % "3.1.2" % "provided",
"commons-validator" % "commons-validator" % "1.6",
"org.elasticsearch" %% "elasticsearch-spark-20" % "7.14.1",
"com.github.pureconfig" %% "pureconfig" % "0.12.3",
"org.rogach" %% "scallop" % "3.5.0",
"com.github.dwickern" %% "scala-nameof" % "1.0.3",
"com.sksamuel.elastic4s" %% "elastic4s-core" % "7.14.1",
"com.sksamuel.elastic4s" %% "elastic4s-client-esjava" % "7.14.1",
"com.sksamuel.elastic4s" %% "elastic4s-http-streams" % "7.14.1",
"com.sksamuel.elastic4s" %% "elastic4s-testkit" % "7.14.1" % "test",
"com.databricks" % "dbutils-api_2.11" % "0.0.4"
```
I have tried multiple approaches but have not found a solution yet. Any help is much appreciated.
Thanks
Upvotes: 0
Views: 332
Reputation: 15086
`dbutils-api_2.11` is compiled for Scala 2.11 and is not compatible with Scala 2.12. You should use `dbutils-api_2.12` if that exists, or otherwise not use that library.

For Scala libraries you should use the `%%` syntax to avoid these kinds of issues:
"com.databricks" %% "dbutils-api" % "0.0.4"
Now the correct suffix will be added automatically by sbt.
On mvnrepository you can see which versions are available.
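For illustration, here is a minimal build.sbt sketch showing what `%%` expands to. The Scala version is assumed from the question; whether a 2.12 artifact of `dbutils-api` is published at this version is something to verify on mvnrepository:

```scala
// Minimal build.sbt sketch, assuming Scala 2.12 as in the question.
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // %% appends the Scala binary-version suffix automatically,
  // so this resolves to the artifact dbutils-api_2.12:
  "com.databricks" %% "dbutils-api" % "0.0.4"
  // Equivalent to spelling out the suffix by hand with a single %:
  // "com.databricks" % "dbutils-api_2.12" % "0.0.4"
)
```

With `%%`, changing `scalaVersion` is enough to pull in matching artifacts for every Scala library, which is exactly what prevents mixed-binary-version errors like the `scala.Product$class` one above.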
Upvotes: 1