I'm currently contributing to the OpenLineage Spark integration, a project written entirely in Java and built with Gradle (8.4). It's a multi-module project with modules named app, shared, spark2, spark3, spark32, spark33, spark34, and spark35.
Problem:
I am attempting to build the shared module for both the Scala 2.12 and Scala 2.13 variants of Apache Spark. All the modules (spark2 through spark35) depend on shared. My goal is to compile modules like spark35 to produce two builds: one for the Scala 2.12 variant of Apache Spark and another for the Scala 2.13 variant. This also requires the shared module to be built against the corresponding Spark variant to avoid runtime errors.
Approach Taken:
I've introduced source sets scala213 and testScala213 inside spark33, spark34, spark35, and shared, using the Java Library plugin. These source sets consume the same sources as the main and test source sets. Dependencies default to the Scala 2.12 variants of Apache Spark, and I have additional declarations for the Scala 2.13 source sets. These include a dependency on the shared module.
Issue Encountered:
When I request the scala213RuntimeElements (or even the scala213ApiElements) configuration from the shared project, the compile classpath does not include the compiled classes from the shared module. However, when I revert to the default configuration, those classes are present on the classpath.
Question:
Why does specifying a particular configuration (scala213RuntimeElements) cause the compiled classes of the shared module to be missing from the compile classpath in my other modules? How can I resolve this issue to ensure the shared module's classes are included on the classpath for both the Scala 2.12 and 2.13 builds?
Below you will find my various build.gradle files.
shared/build.gradle:
sourceSets {
    // Scala 2.13 variant compiled from the same sources as main.
    scala213 {
        java.srcDir("src/main/java")
        resources.srcDir("src/main/resources")
    }
    // Test counterpart; sees the scala213 classes on both classpaths.
    testScala213 {
        compileClasspath += sourceSets.scala213.output
        runtimeClasspath += sourceSets.scala213.output
        java.srcDir("src/test/java")
        resources.srcDir("src/test/resources")
    }
}
configurations {
    // The Java plugin already creates scala213Implementation, scala213CompileOnly,
    // scala213RuntimeOnly and the two classpath configurations for the scala213
    // source set; the api and *Elements configurations are declared here.
    scala213Api
    // Consumable configuration exposing the API to downstream projects.
    scala213ApiElements {
        extendsFrom(scala213Api)
        canBeResolved = false
        canBeConsumed = true
    }
    scala213Implementation.extendsFrom(scala213Api)
    // Consumable configuration exposing the runtime variant.
    scala213RuntimeElements {
        extendsFrom(scala213Implementation, scala213RuntimeOnly)
        canBeResolved = false
        canBeConsumed = true
    }
    scala213CompileClasspath {
        extendsFrom(scala213CompileOnly, scala213Implementation)
        canBeResolved = true
    }
    scala213RuntimeClasspath {
        extendsFrom(scala213Implementation, scala213RuntimeOnly)
        canBeResolved = true
    }
    // Test configurations mirror the main ones.
    testScala213Implementation.extendsFrom(scala213Implementation)
    testScala213RuntimeOnly.extendsFrom(scala213RuntimeOnly)
    testScala213CompileClasspath {
        extendsFrom(testScala213CompileOnly, testScala213Implementation)
        canBeResolved = true
    }
    testScala213RuntimeClasspath {
        extendsFrom(testScala213Implementation, testScala213RuntimeOnly)
        canBeResolved = true
    }
}
spark33/build.gradle:
sourceSets {
    // Same layout as in shared: the Scala 2.13 variant reuses the main sources.
    scala213 {
        java.srcDir("src/main/java")
        resources.srcDir("src/main/resources")
    }
    testScala213 {
        compileClasspath += sourceSets.scala213.output
        runtimeClasspath += sourceSets.scala213.output
        java.srcDir("src/test/java")
        resources.srcDir("src/test/resources")
    }
}
dependencies {
    // Default (Scala 2.12) build consumes shared's default variant.
    implementation(project(path: ":shared"))
    // others removed for brevity
    // Scala 2.13 build asks shared for its scala213RuntimeElements variant.
    scala213Implementation(project(path: ":shared", configuration: "scala213RuntimeElements"))
    // others removed for brevity
}
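A related check at the artifact level (again, the task name is mine and purely diagnostic) shows which file, if any, each dependency on the Scala 2.13 compile classpath actually contributes:
tasks.register('showScala213Artifacts') {
    // Resolve the configuration at the artifact level to see what the
    // project dependency on :shared's scala213RuntimeElements provides.
    def resolved = configurations.scala213CompileClasspath.incoming.artifacts.resolvedArtifacts
    doLast {
        resolved.get().each {
            println "${it.id.componentIdentifier.displayName} -> ${it.file.name}"
        }
    }
}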
Answer:
OK. I figured it out. It's a case of RTFM. Specifically, this part of the manual. I was not adding the "variant" artifact to the shared project, so Gradle didn't know what it was supposed to consume. Confusingly enough, the default configuration uses the classes directories, whereas custom variants require JARs.
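For anyone who lands here, the missing piece in shared/build.gradle looks roughly like this. It's a minimal sketch: the scala213Jar task name and the classifier are mine, not OpenLineage's actual code:
def scala213Jar = tasks.register('scala213Jar', Jar) {
    // Package the scala213 classes and resources into their own archive;
    // the classifier keeps it from clashing with the default jar.
    from sourceSets.scala213.output
    archiveClassifier = 'scala213'
}

artifacts {
    // Attach the JAR to the consumable configurations. Without an artifact,
    // a consumer resolving these variants gets nothing from this project.
    scala213ApiElements scala213Jar
    scala213RuntimeElements scala213Jar
}
With the artifact in place, the project dependency on configuration: "scala213RuntimeElements" resolves to the JAR instead of to nothing. (Gradle's feature variants, i.e. java { registerFeature('scala213') { usingSourceSet(sourceSets.scala213) } }, can wire up the *Elements configurations and the artifact automatically, but that's a larger refactor.)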