Reputation: 308
I am trying to build my Scala source code with SBT, targeting the same Scala version but different versions of libraries. My questions are:
Details: I am building for clusters that run different versions of Spark and other libraries. This seems to be a common use case in such a scenario.
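For example, something along these lines is what I have in mind (just a sketch of the intent, not a working multi-version setup; SPARK_VERSION is an environment variable I would set per target cluster, and the default version is arbitrary):
// build.sbt (sketch)
// Pick the Spark version from the environment, with an arbitrary default.
val sparkVersion = sys.env.getOrElse("SPARK_VERSION", "3.5.0")

libraryDependencies ++= Seq(
  // Provided: the cluster supplies Spark at runtime.
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided
)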
Possible solutions I have found:
Upvotes: 4
Views: 518
Reputation: 71
I solved this when building the Spark PDF Data Source for different Apache Spark versions.
You can check the examples here.
I also build with some custom code for each Spark version:
// Read the target Spark version from the SPARK_VERSION environment
// variable, defaulting to 3.5.0.
val sparkVersion = scala.util.Properties.envOrElse("SPARK_VERSION", "3.5.0")

// Derive the artifact name from the selected Spark version.
val packageName = sparkVersion match {
  case v if v.startsWith("3.3") => "spark-pdf-spark33"
  case v if v.startsWith("3.4") => "spark-pdf-spark34"
  case _                        => "spark-pdf-spark35"
}
// Shared code used by every Spark-specific module.
lazy val common = (project in file("common"))
  .settings(
    name := "common",
    commonSettings
  )
  .disablePlugins(AssemblyPlugin)

// One module per supported Spark version, all depending on common.
lazy val spark35 = (project in file("spark35"))
  .settings(
    name := "spark35",
    commonSettings
  )
  .disablePlugins(AssemblyPlugin)
  .dependsOn(common)

lazy val spark34 = (project in file("spark34"))
  .settings(
    name := "spark34",
    commonSettings
  )
  .disablePlugins(AssemblyPlugin)
  .dependsOn(common)

lazy val spark33 = (project in file("spark33"))
  .settings(
    name := "spark33",
    commonSettings
  )
  .disablePlugins(AssemblyPlugin)
  .dependsOn(common)

// The root project is named after the selected Spark version and pulls in
// only the matching modules.
lazy val root = (project in file("."))
  .settings(
    name := packageName,
    commonSettings
  )
  .dependsOn(dependencyModules(): _*)
  .aggregate(aggregatedModules(): _*)
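The root project references dependencyModules() and aggregatedModules(), which are defined elsewhere in that build. A minimal sketch of what such helpers could look like, assuming they simply pick the version-specific module that matches the selected Spark version (the names and bodies here are my reconstruction, not the project's actual code):
// Sketch only: choose the subproject that matches the selected Spark version.
def selectedSparkModule: Project = sparkVersion match {
  case v if v.startsWith("3.3") => spark33
  case v if v.startsWith("3.4") => spark34
  case _                        => spark35
}

// Modules the root project aggregates (commands run on all of them).
def aggregatedModules(): Seq[ProjectReference] =
  Seq(common, selectedSparkModule)

// Modules the root project depends on (their classes end up on its classpath).
def dependencyModules(): Seq[ClasspathDep[ProjectReference]] =
  Seq(common, selectedSparkModule)
With this layout, a version-specific build is then just a matter of setting the environment variable, e.g. SPARK_VERSION=3.4.1 sbt package.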
Upvotes: 1
Reputation: 1404
No, sbt does not support this directly, but if you want to do that you can use shading. For example, if I want to use two different versions of Guava that are pulled in by two different libraries, I can shade Guava like this:
assemblyShadeRules in assembly := Seq(
  // Rename classes matching com.google.guava** into the shadeio package
  // inside the assembled jar.
  ShadeRule.rename("com.google.guava**" -> "shadeio.@1").inAll
)
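The shade rule only takes effect when the fat jar is built by sbt-assembly, so the plugin has to be added to the build; a minimal sketch of project/plugins.sbt, with the plugin version only as an assumption:
// project/plugins.sbt
// sbt-assembly provides the assembly task and ShadeRule; pick a version
// that matches your sbt release (the 2.x line requires sbt 1.x).
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")
On recent sbt versions the same setting is written with the slash syntax, assembly / assemblyShadeRules := Seq(...).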
Upvotes: 0